Pyramid Stereo Matching Network

23 Mar 2018 | Jia-Ren Chang, Yong-Sheng Chen
The paper introduces PSMNet, a pyramid stereo matching network designed to improve depth estimation from stereo image pairs. PSMNet addresses the difficulty of finding accurate correspondences in ill-posed regions (such as occlusions, repeated patterns, textureless areas, and reflective surfaces) by incorporating global context information. The network consists of two main modules: spatial pyramid pooling (SPP) and a 3D convolutional neural network (3D CNN). The SPP module aggregates context information at different scales and locations, and the resulting features are used to form a cost volume; the 3D CNN then regularizes this cost volume using stacked hourglass networks with intermediate supervision. The method was evaluated on the KITTI 2012 and 2015 benchmarks, achieving state-of-the-art accuracy, and ranked first on the KITTI leaderboards before March 18, 2018, demonstrating its effectiveness at reducing errors in ill-posed regions. The paper also discusses related work and provides a detailed architecture description, experimental setup, and qualitative results.
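To make the cost-volume-plus-3D-CNN pipeline concrete, below is a minimal, self-contained sketch in PyTorch. It is not the authors' implementation: the names (`build_cost_volume`, `CostRegularizer`, `disparity_regression`), channel counts, and the shallow 3D network standing in for the stacked hourglass are illustrative assumptions. Only the overall flow, concatenating left/right features over candidate disparities, regularizing with 3D convolutions, and regressing disparity with a soft-argmin, follows the approach described in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def build_cost_volume(left_feat, right_feat, max_disp):
    """Concatenate left/right feature maps over candidate disparities.

    left_feat, right_feat: (B, C, H, W) feature maps (e.g. from an SPP backbone).
    Returns a 5D cost volume of shape (B, 2C, max_disp, H, W).
    """
    b, c, h, w = left_feat.shape
    cost = left_feat.new_zeros(b, 2 * c, max_disp, h, w)
    for d in range(max_disp):
        if d == 0:
            cost[:, :c, d] = left_feat
            cost[:, c:, d] = right_feat
        else:
            # Shift the right features by d pixels before pairing with the left.
            cost[:, :c, d, :, d:] = left_feat[:, :, :, d:]
            cost[:, c:, d, :, d:] = right_feat[:, :, :, :-d]
    return cost

class CostRegularizer(nn.Module):
    """Toy 3D CNN that regularizes the cost volume (a stand-in for the stacked hourglass)."""
    def __init__(self, in_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(in_ch, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv3d(32, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv3d(32, 1, 3, padding=1),
        )

    def forward(self, cost):
        return self.net(cost).squeeze(1)  # (B, D, H, W)

def disparity_regression(cost, max_disp):
    """Soft-argmin over the disparity dimension to obtain sub-pixel disparities."""
    prob = F.softmax(-cost, dim=1)                                  # (B, D, H, W)
    disp_values = torch.arange(max_disp, dtype=prob.dtype,
                               device=prob.device).view(1, -1, 1, 1)
    return (prob * disp_values).sum(dim=1)                          # (B, H, W)

# Usage with dummy quarter-resolution feature maps:
left = torch.randn(1, 32, 64, 128)
right = torch.randn(1, 32, 64, 128)
volume = build_cost_volume(left, right, max_disp=48)
disp = disparity_regression(CostRegularizer(64)(volume), max_disp=48)
print(disp.shape)  # torch.Size([1, 64, 128])
```

In the actual network the feature maps come from the SPP module, the 3D CNN is a stacked hourglass with intermediate supervision, and the regressed disparity map is upsampled back to full resolution; the sketch above only shows the data flow between those stages.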