RGB-L: Enhancing Indirect Visual SLAM using LiDAR-based Dense Depth Maps

December 05, 2022 · Declared Dead · 🏛 2023 3rd International Conference on Computer, Control and Robotics (ICCCR)

💀 CAUSE OF DEATH: 404 Not Found
Code link is broken/dead
Authors: Florian Sauerbeck, Benjamin Obermeier, Martin Rudolph, Johannes Betz
arXiv ID: 2212.02085
Category: cs.RO (Robotics)
Citations: 9
Venue: 2023 3rd International Conference on Computer, Control and Robotics (ICCCR)
Repository: https://github.com/TUMFTM/ORB
Last Checked: 2 months ago
Abstract
In this paper, we present a novel method for integrating 3D LiDAR depth measurements into the existing ORB-SLAM3 by building upon the RGB-D mode. We propose and compare two methods of depth map generation: conventional computer vision methods, namely an inverse dilation operation, and a supervised deep learning-based approach. We integrate the former directly into the ORB-SLAM3 framework by adding a so-called RGB-L (LiDAR) mode that directly reads LiDAR point clouds. The proposed methods are evaluated on the KITTI Odometry dataset and compared to each other and the standard ORB-SLAM3 stereo method. We demonstrate that, depending on the environment, advantages in trajectory accuracy and robustness can be achieved. Furthermore, we demonstrate that the runtime of the ORB-SLAM3 algorithm can be reduced by more than 40 % compared to the stereo mode. The related code for the ORB-SLAM3 RGB-L mode will be available as open-source software under https://github.com/TUMFTM/ORB_SLAM3_RGBL.
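The "inverse dilation" densification mentioned in the abstract can be sketched as follows: project sparse LiDAR returns into the image, dilate (max-filter) the *inverse* depth so that where several points fall in the same neighborhood the nearest measurement wins, then invert back. This is a pure-NumPy illustration under stated assumptions; the function name, window radius, and square window shape are choices made here, not details taken from the paper's C++ implementation:

```python
import numpy as np

def densify_depth(sparse_depth, radius=2):
    """Densify a sparse depth map by dilating (max-filtering) the
    inverse depth over a (2*radius+1)^2 window, so that overlapping
    projections keep the nearest (smallest-depth) LiDAR return.

    sparse_depth: HxW float array, 0.0 where there is no LiDAR return.
    Returns an HxW float array, still 0.0 far from any return.
    """
    # Work in inverse depth: invalid pixels stay 0, so a plain
    # max-filter both fills holes and prefers nearer points.
    inv = np.where(sparse_depth > 0, 1.0 / np.maximum(sparse_depth, 1e-9), 0.0)
    h, w = inv.shape
    dense_inv = np.zeros_like(inv)
    # Max over the window via shifted overlays (no SciPy dependency).
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            ys_dst = slice(max(0, -dy), min(h, h - dy))
            xs_dst = slice(max(0, -dx), min(w, w - dx))
            ys_src = slice(max(0, dy), min(h, h + dy))
            xs_src = slice(max(0, dx), min(w, w + dx))
            np.maximum(dense_inv[ys_dst, xs_dst],
                       inv[ys_src, xs_src],
                       out=dense_inv[ys_dst, xs_dst])
    # Back to metric depth; pixels never reached by any return stay 0.
    return np.where(dense_inv > 0, 1.0 / np.maximum(dense_inv, 1e-9), 0.0)
```

The resulting dense map can then be fed to an RGB-D front end exactly like a depth-camera image, which is what makes reusing ORB-SLAM3's existing RGB-D mode attractive.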
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — Robotics

Died the same way — 💀 404 Not Found