Authors:
Liu, Hui; Zhang, Xuebo; Jiang, Jingqi
Nankai Univ, Inst Robot & Automat, Informat Syst, Coll Artificial Intelligence, Tianjin 300350, Peoples R China
Nankai Univ, Tianjin Key Lab Intelligent Robot, Tianjin 300350, Peoples R China
ISBN:
(Print) 9781665478960
Sensor calibration is one of the basic tasks for a multimodal sensing system. In this paper, we present a two-stage spatiotemporal calibration method for a commonly used sensor suite, i.e., the LiDAR-IMU-camera sensor combination. In the first stage, the proposed method combines correlation analysis with hand-eye calibration to acquire initial values of the time offsets and extrinsic rotation parameters for IMU-LiDAR and IMU-camera respectively, and refines the time offsets by aligning the rotational trajectories. In the second stage, a continuous-time batch optimization framework is employed to jointly estimate the IMU-LiDAR extrinsic parameters and the IMU trajectory, and the optimized trajectory is then used to calibrate the IMU-camera extrinsic parameters. In addition, the data association scheme is designed according to the modalities of the LiDAR and the camera rather than relying on artificial targets. We validate the proposed method on both a public dataset and self-recorded data. Experimental results demonstrate the feasibility and performance of the proposed method through comparison with state-of-the-art IMU-LiDAR and IMU-camera calibration methods.
The integrated inertial system, typically combining an IMU with an exteroceptive sensor such as radar, light detection and ranging (LiDAR), or a camera, has been widely accepted and applied in modern robotic applications for ego-motion estimation, motion control, or autonomous exploration. To improve system accuracy, robustness, and usability, multiple heterogeneous sensors are generally integrated in a resilient manner, which benefits system performance in terms of failure tolerance, perception capability, and environment compatibility. For such systems, accurate and consistent spatiotemporal calibration is required to maintain a unique spatiotemporal framework for multisensor fusion. Considering that most existing calibration methods are generally oriented to specific integrated inertial systems, often focus on spatial-only determination, and usually require artificial targets, lacking convenience and usability, we propose iKalibr: a unified targetless spatiotemporal calibration framework for resilient integrated inertial systems that overcomes the above issues and enables both accurate and consistent calibration. Four commonly employed sensors are currently supported in iKalibr, namely the IMU, radar, LiDAR, and camera. The proposed method starts with a rigorous and efficient dynamic initialization, in which all parameters in the estimator are accurately recovered. Subsequently, several continuous-time batch optimizations are conducted to refine the initialized parameters toward better states. Extensive real-world experiments were conducted to verify the feasibility and evaluate the calibration performance of iKalibr. The results demonstrate that iKalibr achieves accurate and resilient spatiotemporal calibration.
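Continuous-time batch optimization frameworks of this kind commonly represent the trajectory as a B-spline that can be queried at any sensor timestamp. Below is a minimal sketch of evaluating a uniform cubic B-spline in matrix form; the spline order, knot layout, and function names are assumptions for illustration, not iKalibr's actual parameterization:

```python
import numpy as np

# Uniform cubic B-spline basis in matrix form:
# p(u) = [1 u u^2 u^3] @ M @ [P_i, P_{i+1}, P_{i+2}, P_{i+3}]
M = (1.0 / 6.0) * np.array([
    [ 1.0,  4.0,  1.0, 0.0],
    [-3.0,  0.0,  3.0, 0.0],
    [ 3.0, -6.0,  3.0, 0.0],
    [-1.0,  3.0, -3.0, 1.0],
])

def spline_eval(ctrl, t, dt):
    """Evaluate a uniform cubic B-spline trajectory at time t.
    ctrl: (N, D) control points, one every dt seconds (knot i at i*dt)."""
    i = int(t // dt)                     # index of the active spline segment
    u = t / dt - i                       # normalized time within the segment
    U = np.array([1.0, u, u**2, u**3])
    return U @ M @ ctrl[i:i + 4]

# Linearly spaced control points reproduce a straight-line trajectory.
ctrl = np.arange(8.0).reshape(-1, 1)     # 1-D positions 0, 1, ..., 7
print(spline_eval(ctrl, 2.5, 1.0))       # ~[3.5]
```

In a calibration estimator, the control points (and the sensor time offsets shifting the query time t) become the variables of the batch optimization; here only the evaluation is shown.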
Sensor calibration is a fundamental step for improving the performance in sensor fusion, the aim of which is to spatially and temporally register sensors with respect to each other. This paper presents a high-accuracy autocalibration method to estimate extrinsic parameters between LiDAR and an IMU. LiDAR/IMU calibration is a challenging task since the raw measurements are distorted, biased, noisy, and asynchronous. Our calibration approach adopts continuous-time trajectory estimation wherein the IMU trajectory is modeled by Gaussian process (GP) regression with respect to the independent sampling timestamps. Accordingly, the distorted and delayed LiDAR points sampled at discrete timestamps can be analytically modeled in on-manifold batch optimization. To efficiently and accurately associate laser points with stable environmental objects, the method is carried out in known environments with a point map that is segmented as structured planes and managed by a specially designed octree map. We thoroughly investigated factors relevant to the calibration accuracy and evaluated the performance of the proposed method using both simulated and real-world datasets. The results demonstrate that the accuracy and robustness of our calibration approach are sufficient for most applications.
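The core convenience of the GP trajectory model is that the trajectory can be queried at the arbitrary, asynchronous timestamps of the LiDAR points. A minimal sketch of GP regression with an RBF kernel follows; the kernel choice, hyperparameters, and function name are illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

def gp_interpolate(t_train, y_train, t_query, ell=0.5, sigma_n=1e-3):
    """Posterior mean of GP regression with an RBF kernel:
    k(t*, T) @ (K + sigma_n^2 I)^-1 @ y, queried at arbitrary timestamps."""
    def rbf(a, b):
        d = a[:, None] - b[None, :]
        return np.exp(-0.5 * (d / ell) ** 2)
    K = rbf(t_train, t_train) + sigma_n ** 2 * np.eye(len(t_train))
    alpha = np.linalg.solve(K, y_train)      # (K + sigma_n^2 I)^-1 y
    return rbf(t_query, t_train) @ alpha

# Recover a smooth 1-D "trajectory" between discrete samples.
t_train = np.linspace(0.0, 2.0 * np.pi, 30)
y_train = np.sin(t_train)
pred = gp_interpolate(t_train, y_train, np.array([1.0, 2.0]))
print(pred)  # close to [sin(1), sin(2)]
```

A real on-manifold estimator would model rotations on SO(3) and jointly optimize the extrinsics and time delay; this sketch only shows how a GP yields a continuous-time query from discrete samples.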