SLAM calibration (3): VIO system

1. The principle of VIO calibration

 A VIO system consists of a binocular (stereo) camera and an IMU, which correspond to the camera coordinate frame and the body coordinate frame respectively. We need to calibrate the pose (transformation) matrix between these two frames, as well as the time offset between the image and IMU data streams.

 We use B-spline curves to calibrate the VIO system. The B-spline curve is an extension of the Bezier curve. A Bezier curve is a higher-order curve that can be used to design complex shapes, but it has two drawbacks: its degree grows with the number of control points, and changing any single control point affects the whole curve. The B-spline curve was developed in response to these shortcomings: it allows the order to be specified independently and is equivalent to splicing together Bezier curve segments.

 In the calibration we only use the B-spline curve to build the residual equations, so we will not elaborate on its underlying theory here and only introduce its basic use. A B-spline curve is mainly defined by control points, knots, and a table of basis functions. As long as the control points and the number of knots are specified, the B-spline curve can be evaluated from the basis-function table, as shown in the figure below (squares are control points, triangles are knots).
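 As a concrete illustration (independent of kalibr), the sketch below evaluates a B-spline curve from a set of control points and a clamped knot vector using scipy; the degree, control points, and knot values are all made-up example numbers.

# A minimal sketch, independent of kalibr: evaluate a B-spline curve from
# control points and a knot vector. All numeric values are illustrative.
import numpy as np
from scipy.interpolate import BSpline

degree = 3  # cubic B-spline, commonly used for continuous-time trajectories
ctrl_pts = np.array([[0.0, 0.0], [1.0, 2.0], [2.0, -1.0], [3.0, 3.0], [4.0, 0.0]])

# Clamped knot vector: len(knots) must equal len(ctrl_pts) + degree + 1.
knots = np.concatenate((np.zeros(degree),
                        np.linspace(0.0, 1.0, len(ctrl_pts) - degree + 1),
                        np.ones(degree)))

spline = BSpline(knots, ctrl_pts, degree)
u = np.linspace(0.0, 1.0, 50)
curve = spline(u)                     # 50 sampled points on the curve
velocity = spline.derivative(1)(u)    # first derivative along the curve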

 The camera only provides pose data estimated from the raw images, while the IMU provides three-axis acceleration and angular velocity, so there is no direct correspondence between the two kinds of measurements. We therefore fit a B-spline curve to the camera poses and build residual equations against the IMU: accelerometer error, gyroscope error, bias motion error, and pose motion error terms are added to the BA optimization problem, and a nonlinear optimization is iterated, as shown in the kalibr snippet below (a sketch of the residual forms follows the snippet):

for cam in self.CameraChain.camList:
    cam.findTimeshiftCameraImuPrior(self.ImuList[0], verbose)

##########################################
## add error terms
##########################################
# Add calibration target reprojection error terms for all cameras in the chain
self.CameraChain.addCameraChainErrorTerms(problem, self.poseDv, blakeZissermanDf=blakeZisserCam, timeOffsetPadding=timeOffsetPadding)

# Initialize IMU error terms.
for imu in self.ImuList:
    imu.addAccelerometerErrorTerms(problem, self.poseDv, self.gravityExpression, mSigma=huberAccel, accelNoiseScale=accelNoiseScale)
    imu.addGyroscopeErrorTerms(problem, self.poseDv, mSigma=huberGyro, gyroNoiseScale=gyroNoiseScale, g_w=self.gravityExpression)

    # Add the bias motion terms.
    if doBiasMotionError:
        imu.addBiasMotionTerms(problem)

# Add the pose motion terms.
if doPoseMotionError:
    self.addPoseMotionTerms(problem, mrTranslationVariance, mrRotationVariance)

# Add a gravity prior
self.problem = problem
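
 Conceptually, the accelerometer and gyroscope error terms added above are the differences between the IMU measurements and the values predicted from the pose B-spline. The sketch below is independent of kalibr; its function names, arguments, and sign conventions are illustrative assumptions, not kalibr's actual implementation.

# A minimal sketch, independent of kalibr, of the residual forms behind the
# accelerometer and gyroscope error terms. Names and conventions are assumed.
import numpy as np

def accel_residual(R_w_b, a_w, g_w, bias_a, accel_meas):
    # Predicted specific force in the body frame vs. the measured acceleration.
    a_pred = R_w_b.T @ (a_w - g_w) + bias_a
    return accel_meas - a_pred

def gyro_residual(omega_b, bias_g, gyro_meas):
    # Predicted body-frame angular velocity vs. the measured angular rate.
    return gyro_meas - (omega_b + bias_g)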

 To estimate the time offset, the B-spline is again used to compute the camera's rotational angular velocity; the gyroscope angular velocity is taken as the measurement, and a cross-correlation between the two signals yields the offset index, and hence the time shift.
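 The following sketch is not kalibr's implementation; it assumes both angular-rate magnitude signals have already been resampled onto a common, uniform time grid with spacing dt, and shows how such a cross-correlation can produce a time-offset prior.

# A minimal sketch, assuming cam_rate_norm and imu_rate_norm are |angular
# velocity| signals resampled to the same uniform rate with period dt.
import numpy as np

def estimate_time_offset(cam_rate_norm, imu_rate_norm, dt):
    # Remove the means so the correlation is driven by the motion pattern.
    cam = cam_rate_norm - np.mean(cam_rate_norm)
    imu = imu_rate_norm - np.mean(imu_rate_norm)
    corr = np.correlate(imu, cam, mode="full")
    # Index of the correlation peak, converted to a signed sample shift.
    shift_samples = np.argmax(corr) - (len(cam) - 1)
    return shift_samples * dt  # time offset in seconds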

2. Calibration steps

 We still use the kalibr tool for calibration. When collecting the VIO data, move the rig more vigorously than when collecting the binocular data so that the IMU is fully excited. For the generated bag file, we reuse the previous IMU and camera calibration results and then run the following command to calibrate:

kalibr_calibrate_imu_camera --bag voi.bag --target 2april_6x6.yaml --imu imu_bmx160.yaml --cams camchain-cam.yaml

 After a long wait, the cam0, cam1, gyroscope, and accelerometer errors are shown in the figure below. The figure shows that the camera reprojection errors exceed 2 pixels, so the calibration result is not very good and the binocular calibration will eventually have to be redone; the gyroscope and accelerometer errors are normal:

 The final calibration results are as follows:

cam0:
  T_cam_imu:
  - [-0.021640613967887967, 0.9996846699952313, 0.012737519523803142, 0.0943689450739029]
  - [-0.05318702193582009, -0.01387363738123537, 0.9984881886549353, -0.016075204084512736]
  - [0.9983500510966545, 0.020930426711856576, 0.0534704845028372, -0.002524643123212848]
  - [0.0, 0.0, 0.0, 1.0]
  cam_overlaps: [1]
  camera_model: pinhole
  distortion_coeffs: [-0.029772066960790305, 0.06754143559647674, -0.05439220501790035, 0.013562369372802113]
  distortion_model: equidistant
  intrinsics: [288.42279508017543, 288.2501118514416, 314.48227248976, 209.13222129484865]
  resolution: [640, 400]
  rostopic: /cam0/image_raw
  timeshift_cam_imu: 0.038078317290195576
cam1:
  T_cam_imu:
  - [0.01016316997918798, 0.9998444491129306, -0.014414837981561512, -0.020421941435881386]
  - [-0.03860727630216526, 0.014797185810593971, 0.9991448951521571, -0.011343342336408197]
  - [0.999202776313448, -0.009597961770464236, 0.038751657220695845, -0.003410126461149912]
  - [0.0, 0.0, 0.0, 1.0]
  T_cn_cnm1:
  - [0.9991256216395883, -0.028805073535218002, 0.03030280386253717, -0.11509491601667944]
  - [0.028354632585837286, 0.9994824918256658, 0.015190896776503355, 0.0020860975265854184]
  - [-0.030724696812546234, -0.01431838931524384, 0.9994253232399051, 0.0017823520038439426]
  - [0.0, 0.0, 0.0, 1.0]
  cam_overlaps: [0]
  camera_model: pinhole
  distortion_coeffs: [-0.028122535573276014, 0.06241390797743572, -0.050088085109920036, 0.012376401933419342]
  distortion_model: equidistant
  intrinsics: [289.36201843178367, 289.2052912742452, 312.7097758328454, 210.45276733552467]
  resolution: [640, 400]
  rostopic: /cam1/image_raw
  timeshift_cam_imu: 0.03760055980093673
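
 As a usage note, the sketch below loads the resulting camchain-imucam yaml and applies T_cam_imu; the filename is an assumed example (kalibr names the output after the input bag). Per kalibr's convention, timeshift_cam_imu is defined such that t_imu = t_cam + shift.

# A minimal sketch of reading the calibration result and using T_cam_imu to
# transform a point from the IMU frame into the cam0 frame. The filename is
# an assumed example; kalibr names the output after the input bag.
import numpy as np
import yaml

with open("camchain-imucam-vio.yaml", "r") as f:
    chain = yaml.safe_load(f)

T_cam_imu = np.array(chain["cam0"]["T_cam_imu"])  # 4x4 homogeneous transform
t_shift = chain["cam0"]["timeshift_cam_imu"]      # t_imu = t_cam + t_shift

p_imu = np.array([0.1, 0.0, 0.0, 1.0])            # homogeneous point in the IMU frame
p_cam = T_cam_imu @ p_imu                         # the same point in the cam0 frame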

SLAM calibration series articles

1. SLAM calibration (2): binocular stereo vision

2. SLAM calibration (1): monocular vision

3. IMU calibration (3): deterministic error calibration

4. IMU calibration (2): random error calibration

5. IMU calibration

Reference: SLAM calibration (3): VIO system, Tencent Cloud Community, https://cloud.tencent.com/developer/article/1802608