Particle Filter SLAM for Vehicle Localization


2024 | Tianrui Liu, Changxin Xu, Yuxin Qiao, Chufeng Jiang, Jiqiang Yu
This paper presents a Particle Filter SLAM approach for vehicle localization that integrates wheel-encoder data, fiber optic gyro (FOG) measurements, and lidar scans to achieve precise motion estimation and environmental perception. The method addresses the central challenge of simultaneous localization and mapping (SLAM) in robotics: accurate mapping depends on reliable estimation of the robot's location, and vice versa. The Particle Filter SLAM framework fuses these sensor streams to estimate the robot's position and orientation in an unknown environment. The motion model uses encoder and FOG data to estimate robot motion, while the observation model uses lidar data to characterize surrounding obstacles. The framework is a Bayes filter with a prediction step that propagates the robot trajectory through a differential-drive motion model and an update step that corrects the robot pose via scan-grid correlation. The mapping process transforms lidar points from the lidar frame to the world frame and updates the occupancy map's log-odds; a texture map is generated from RGB values and point clouds derived from disparity maps. The technical approach thus combines lidar scans, a differential-drive model, and probabilistic occupancy grid mapping. Experiments demonstrate the effectiveness of the Particle Filter SLAM algorithm in real-world scenarios, producing satisfactory maps and localization results. The study contributes to the field by showing the practicality of Particle Filter SLAM in addressing real-world challenges; the authors acknowledge the computational demands of SLAM and highlight the approach's potential for autonomous navigation and robotics.
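The prediction step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the noise magnitudes, function names, and the midpoint-heading approximation are assumptions. Each particle carries a pose [x, y, theta] that is propagated by the encoder-derived distance and the FOG yaw increment, with injected Gaussian noise so the particle cloud covers motion uncertainty.

```python
import numpy as np

def predict(particles, encoder_dist, fog_yaw_delta, noise_std=(0.01, 0.005)):
    """Prediction step of the Bayes filter: propagate each particle with a
    differential-drive motion model driven by the encoder distance and the
    FOG yaw increment. particles is an (N, 3) array of [x, y, theta]."""
    n = len(particles)
    # Perturb the controls per particle so the cloud spreads with uncertainty.
    d = encoder_dist + np.random.normal(0.0, noise_std[0], n)
    dtheta = fog_yaw_delta + np.random.normal(0.0, noise_std[1], n)
    theta = particles[:, 2]
    # Advance along the midpoint heading (half the yaw increment applied).
    particles[:, 0] += d * np.cos(theta + dtheta / 2)
    particles[:, 1] += d * np.sin(theta + dtheta / 2)
    particles[:, 2] = theta + dtheta
    return particles
```

With zero noise, a particle at the origin driven forward 1 m ends at x = 1, which makes the model easy to sanity-check before tuning the noise parameters.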
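The scan-grid-correlation update can be sketched as below. This is a simplified, assumed version (the paper does not publish this exact code): each particle's lidar scan is transformed into the world frame using that particle's pose, the endpoints are binned into grid cells, and the particle's weight grows with the number of endpoints landing on cells the map already considers occupied.

```python
import numpy as np

def update_weights(particles, weights, scan_xy, grid, res, origin):
    """Update step: weight each particle by how well its transformed lidar
    scan correlates with occupied cells of the current occupancy grid.
    scan_xy: (M, 2) lidar endpoints in the body frame.
    grid: 2D array of log-odds; positive values mean likely occupied."""
    corr = np.zeros(len(particles))
    for i, (x, y, th) in enumerate(particles):
        # Rotate and translate the scan into the world frame at this pose.
        c, s = np.cos(th), np.sin(th)
        wx = x + c * scan_xy[:, 0] - s * scan_xy[:, 1]
        wy = y + s * scan_xy[:, 0] + c * scan_xy[:, 1]
        # Convert world coordinates to integer grid indices.
        ix = ((wx - origin[0]) / res).astype(int)
        iy = ((wy - origin[1]) / res).astype(int)
        ok = (ix >= 0) & (ix < grid.shape[0]) & (iy >= 0) & (iy < grid.shape[1])
        # Correlation score: endpoints that land on occupied cells.
        corr[i] = np.sum(grid[ix[ok], iy[ok]] > 0)
    # Exponentiate scores (softmax-style) and renormalize the weights.
    w = weights * np.exp(corr - corr.max())
    return w / w.sum()
```

A particle whose pose hypothesis aligns the scan with the map receives a higher normalized weight, which then drives resampling.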
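The mapping step (lidar-to-world transform plus log-odds update) might look like the following sketch. The increment/decrement constants, the simple integer ray-tracing, and the clipping bound are all assumptions for illustration; the paper's own parameters are not reproduced here.

```python
import numpy as np

LOG_ODDS_OCC = np.log(4.0)    # added to the cell a beam ends in
LOG_ODDS_FREE = np.log(0.25)  # added to cells the beam passes through

def update_map(grid, pose, scan_xy, res, origin, clip=10.0):
    """Mapping step: transform lidar endpoints from the body frame to the
    world frame at the given pose, then update log-odds: endpoint cells
    become more occupied, traversed cells become more free."""
    x, y, th = pose
    c, s = np.cos(th), np.sin(th)
    wx = x + c * scan_xy[:, 0] - s * scan_xy[:, 1]
    wy = y + s * scan_xy[:, 0] + c * scan_xy[:, 1]
    x0 = int((x - origin[0]) / res)
    y0 = int((y - origin[1]) / res)
    for ex, ey in zip(((wx - origin[0]) / res).astype(int),
                      ((wy - origin[1]) / res).astype(int)):
        # Mark free cells along the beam with a simple integer line walk.
        n = max(abs(ex - x0), abs(ey - y0), 1)
        for t in range(n):
            fx = x0 + (ex - x0) * t // n
            fy = y0 + (ey - y0) * t // n
            if 0 <= fx < grid.shape[0] and 0 <= fy < grid.shape[1]:
                grid[fx, fy] += LOG_ODDS_FREE
        if 0 <= ex < grid.shape[0] and 0 <= ey < grid.shape[1]:
            grid[ex, ey] += LOG_ODDS_OCC
    np.clip(grid, -clip, clip, out=grid)  # keep log-odds bounded
    return grid
```

Working in log-odds turns the Bayesian occupancy update into a cheap addition per cell, and clipping prevents any cell from becoming unrecoverably certain.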