
DurLAR: A High-Fidelity 128-Channel LiDAR Dataset with Panoramic Ambient and Reflectivity Imagery for Multi-Modal Autonomous Driving Applications

Li, Li; Ismail, Khalid N.; Shum, Hubert P.H.; Breckon, Toby P.


Authors

Li Li li.li4@durham.ac.uk
PGR Student Doctor of Philosophy




Abstract

We present DurLAR, a high-fidelity 128-channel 3D LiDAR dataset with panoramic ambient (near infrared) and reflectivity imagery, as well as a sample benchmark task using depth estimation for autonomous driving applications. Our driving platform is equipped with a high-resolution 128-channel LiDAR, a 2MPix stereo camera, a lux meter and a GNSS/INS system. Ambient and reflectivity images are made available along with the LiDAR point clouds to facilitate multi-modal use of concurrent ambient and reflectivity scene information. Leveraging DurLAR, with a resolution exceeding that of prior benchmarks, we consider the task of monocular depth estimation and use this increased availability of higher resolution, yet sparse, ground truth scene depth information to propose a novel joint supervised/self-supervised loss formulation. We compare performance across our new DurLAR dataset, the established KITTI benchmark and the Cityscapes dataset. Our evaluation shows that our joint use of supervised and self-supervised loss terms, enabled by the superior ground truth resolution and availability within DurLAR, improves the quantitative and qualitative performance of leading contemporary monocular depth estimation approaches (RMSE = 3.639, SqRel = 0.936).
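
To illustrate the general idea of combining a supervised term on sparse LiDAR ground truth with a self-supervised photometric term, the sketch below shows one possible formulation in PyTorch. The function names, loss weights and the SSIM-based photometric term are illustrative assumptions in the style of common self-supervised monocular depth pipelines, not the exact formulation used in the paper.

```python
# Hypothetical sketch of a joint supervised/self-supervised depth loss.
# Weights, constants and the photometric term are illustrative, not the
# DurLAR paper's exact formulation.
import torch
import torch.nn.functional as F


def supervised_loss(pred_depth, lidar_depth):
    """L1 loss on pixels where sparse projected LiDAR ground truth exists."""
    mask = lidar_depth > 0  # LiDAR projections are sparse; 0 marks missing depth
    if mask.sum() == 0:
        return pred_depth.new_zeros(())
    return F.l1_loss(pred_depth[mask], lidar_depth[mask])


def photometric_loss(target_img, reconstructed_img, alpha=0.85):
    """Self-supervised term: SSIM + L1 between the target frame and its
    reconstruction warped from an adjacent view using the predicted depth."""
    c1, c2 = 0.01 ** 2, 0.03 ** 2  # standard SSIM constants for images in [0, 1]
    l1 = (target_img - reconstructed_img).abs().mean(1, keepdim=True)
    mu_x = F.avg_pool2d(target_img, 3, 1, 1)
    mu_y = F.avg_pool2d(reconstructed_img, 3, 1, 1)
    sigma_x = F.avg_pool2d(target_img ** 2, 3, 1, 1) - mu_x ** 2
    sigma_y = F.avg_pool2d(reconstructed_img ** 2, 3, 1, 1) - mu_y ** 2
    sigma_xy = F.avg_pool2d(target_img * reconstructed_img, 3, 1, 1) - mu_x * mu_y
    ssim = ((2 * mu_x * mu_y + c1) * (2 * sigma_xy + c2)) / \
           ((mu_x ** 2 + mu_y ** 2 + c1) * (sigma_x + sigma_y + c2))
    dssim = ((1 - ssim) / 2).clamp(0, 1).mean(1, keepdim=True)
    return (alpha * dssim + (1 - alpha) * l1).mean()


def joint_loss(pred_depth, lidar_depth, target_img, reconstructed_img,
               w_sup=1.0, w_selfsup=1.0):
    """Weighted sum of the two terms; the weights here are placeholders."""
    return w_sup * supervised_loss(pred_depth, lidar_depth) + \
           w_selfsup * photometric_loss(target_img, reconstructed_img)
```

The supervised term exploits the denser 128-channel ground truth available in DurLAR, while the photometric term provides a training signal on pixels the LiDAR does not cover; the relative weighting is a design choice left open here.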

Citation

Li, L., Ismail, K. N., Shum, H. P. H., & Breckon, T. P. (2021, December). DurLAR: A High-Fidelity 128-Channel LiDAR Dataset with Panoramic Ambient and Reflectivity Imagery for Multi-Modal Autonomous Driving Applications. Presented at the International Conference on 3D Vision, Surrey / Online.

Presentation Conference Type Conference Paper (published)
Conference Name International Conference on 3D Vision
Start Date Dec 1, 2021
End Date Dec 3, 2021
Publication Date 2021-12
Deposit Date Oct 25, 2021
Publicly Available Date Dec 4, 2021
Pages 1227-1237
DOI https://doi.org/10.1109/3dv53792.2021.00130
Public URL https://durham-repository.worktribe.com/output/1138941
Publisher URL https://doi.ieeecomputersociety.org/10.1109/3DV53792.2021.00130
