
Motion-based Lidar-camera Calibration via Cross-modality Structure Consistency

posted on 2023-09-05, 15:56, authored by Ni Ou

Lidar and cameras are essential sensors for automated vehicles and intelligent robots, and they are frequently fused in complex tasks. Precise extrinsic calibration is a prerequisite for Lidar-camera fusion. Hand-eye calibration is among the most widely used targetless calibration approaches. This paper identifies a degeneracy of hand-eye calibration that arises when sensor motions lack rotation. Such motion is common for ground vehicles, especially those traveling on urban roads, and it leads to a significant deterioration in translational calibration accuracy. To address this problem, we propose a novel motion-based Lidar-camera calibration framework based on cross-modality structure consistency. The framework is globally convergent within the specified search range and achieves satisfactory translational calibration accuracy even in degenerate scenarios. To verify its effectiveness, we compare its performance against one motion-based method and two appearance-based methods on six Lidar-camera data sequences from the KITTI dataset. An ablation study further demonstrates the contribution of each module within our framework.
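The rotation-free degeneracy mentioned in the abstract can be seen directly in the translation part of the standard hand-eye equation AX = XB, which reads (R_A − I) t_X = R_X t_B − t_A: when the sensor motion has no rotation (R_A = I), the left-hand side vanishes and the extrinsic translation t_X drops out of the constraint entirely. The following NumPy sketch is a hypothetical illustration of that observation, not the paper's code; the extrinsic rotation `R_X` and the motions are made up for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def rot_z(theta):
    """Rotation matrix about the z-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

R_X = rot_z(0.3)             # assumed (made-up) extrinsic rotation
R_A = np.eye(3)              # pure-translation Lidar motion: no rotation
t_A = rng.normal(size=3)     # arbitrary Lidar translation

# Camera motion consistent with the same rigid link A X = X B:
# with R_A = I, the rotation part gives R_B = I and the translation part
# gives t_B = R_X^T (R_A t_X + t_A - t_X) = R_X^T t_A, independent of t_X.
t_B = R_X.T @ t_A

def residual(t_X_candidate):
    """Residual of the hand-eye translation constraint (R_A - I) t_X - (R_X t_B - t_A)."""
    return (R_A - np.eye(3)) @ t_X_candidate - (R_X @ t_B - t_A)

# Every candidate extrinsic translation satisfies the constraint equally well,
# so t_X is unobservable from rotation-free motion.
for _ in range(3):
    t_guess = rng.normal(size=3)
    assert np.allclose(residual(t_guess), 0.0)
print("translation residual is independent of t_X when motion lacks rotation")
```

This is exactly why purely motion-based calibration fails on near-straight urban driving, and why the paper's framework brings in cross-modality structure consistency to recover the translation.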

Our code will be made public on GitHub after acceptance.


National Key Research and Development Program of China under Grant 2019YFC1511401

National Natural Science Foundation of China under Grant 62173038

National Natural Science Foundation of China under Grant 61773060




Submitting Author's Institution

Beijing Institute of Technology

Submitting Author's Country

China
