
d435i IMU calibration: fixing accelerometer inaccuracies and calibrating twice. #12829

Closed
TakShimoda opened this issue Apr 4, 2024 · 8 comments



Required Info
Camera Model D435i
Firmware Version (5.13.0.50)
Operating System & Version Ubuntu 20.04
Kernel Version (Linux Only) (5.15.0-94-generic)
Platform PC/NVIDIA Jetson Nano 4GB
SDK Version 2.54.2
Language python
Segment Robot

Issue Description

Hello, I have numerous d435i cameras mounted on TurtleBot3 robots to run VIO algorithms such as ORB-SLAM3, VINS-Fusion, OKVIS, Kimera-VIO, etc.

The issue is that the IMU data may be causing problems for the algorithms. For example, ORB-SLAM3 works well with just VO but crashes with VIO, and OKVIS and VINS-Fusion see significant drift (e.g. a circular motion looks like an arc with a very large radius). This is even after doing the camera-IMU calibration with kalibr (including the calculation of Allan variance deviations) for each of the robots (I've tested 2 so far, with similar results).

I decided just to check the raw IMU data from one of the cameras while it's stationary, as follows:

```
ros2 launch realsense2_camera rs_launch.py enable_gyro:=true enable_accel:=true enable_infra1:=true enable_infra2:=true gyro_fps:=400.0 accel_fps:=250.0 unite_imu_method:='linear_interpolation'
ros2 topic echo /camera/imu sensor_msgs/msg/Imu --csv --qos-history keep_all --qos-reliability reliable > imu.csv
```

  • When I check the csv file, I noticed the accelerometer data had inaccuracies: x was reading about -0.13, and z about -0.45, while y was about -9.9. These values should be [x, y, z] = [0, -9.80665, 0].
  • From here, I ran the IMU calibration script and wrote the results to the camera's EEPROM.
  • When I tried again, there were still poor results: x typically read in the range of -0.15 to -0.19, and z from -0.11 to -0.15. y improved but was still around -9.77. I checked the gyroscope's angular velocity values and they were all on the order of 10^-6, which I think is fair. I also checked the motion in rs-motion, and it seemed reasonable.
    • I also checked the motion correction module in the realsense-viewer and noticed that although it does a fairly good job of correcting the y-axis (about 9.40 without correction vs 9.8 with it), the x and z values are still inaccurate, if not worse: x is about -0.18 with correction and -0.19 without, while z is about -0.14 with correction and about 0.05 without it.
  • I rotated the camera around and gravity did stay consistently around 9.80 (unlike, for example, this user). Combined with what I see in rs-motion, it seems orientation is fairly accurate, but the camera is reading noticeable acceleration in the x-z directions when it shouldn't, and I think this is causing poor performance for VIO, especially for a ground robot that only moves in the ground plane.
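As a rough sanity check on the at-rest readings, the per-axis bias can be estimated by averaging stationary samples against the ideal gravity vector. A minimal sketch (the sample values below are made-up numbers resembling the readings above, not my actual log):

```python
import numpy as np

G = 9.80665  # standard gravity, m/s^2

def stationary_bias(accel, ideal=(0.0, -G, 0.0)):
    """Mean offset of stationary accelerometer samples from the ideal
    gravity vector (camera resting in position 1: y-axis along gravity)."""
    accel = np.asarray(accel, dtype=float)
    return accel.mean(axis=0) - np.asarray(ideal)

# Made-up samples resembling the stationary readings described above:
samples = [[-0.13, -9.90, -0.45],
           [-0.12, -9.91, -0.44],
           [-0.14, -9.89, -0.46]]
bias = stationary_bias(samples)  # per-axis offset in m/s^2
```

In practice I would run this over the whole imu.csv rather than a handful of rows, so sensor noise averages out and only the systematic offset remains.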

From here, I noticed that my IMU calibration could be improved, because I simply used my hands to try to keep the camera still during calibration at the 6 positions, but saw that the box should be used to keep all positions perfectly perpendicular.

  • I just had a question before proceeding with the calibration again. When checking the ROS2 imu topic, I noticed it already gives the "corrected" data after accounting for the first calibration I did. From the IMU calibration white paper, the intrinsics are used to correct the raw data with:

[screenshot: the intrinsic correction formula from the IMU calibration white paper]

  • Would feeding the already-calibrated data as raw measurements into a new calibration, to solve for a new set of intrinsics, be the proper way to go? Or is there a way to reset the EEPROM to factory values so I can calibrate from those? I saw issues such as D435i Calibration #3239 and was wondering if this could be an issue.
  • Also, what is an acceptable level for the x, y, z values at rest? Should the deviation from the ideal values be in the order of 10^-6 like the gyroscope values?
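For reference, my understanding of how the intrinsics get applied (an assumption based on the white paper's formula and the shape of the 4x3 block that rs-imu-calibration.py prints: a 3x3 scale/cross-axis matrix stacked on a bias row, applied to a homogeneous raw sample):

```python
import numpy as np

# Assumed correction form: a_corrected = [ax, ay, az, 1] @ X, where X is
# the 4x3 block printed by rs-imu-calibration.py (3x3 scale/cross-axis
# terms plus a final bias row). Values are rounded from my first
# calibration output, purely for illustration.
X = np.array([[ 1.0179,  0.0021,  0.0237],
              [-0.0083,  1.0248, -0.0105],
              [-0.0037,  0.0325,  1.0126],
              [ 0.0594,  0.2112,  0.2980]])

a_raw = np.array([-0.076, -9.770, -0.025])  # one stationary sample
a_corr = np.append(a_raw, 1.0) @ X          # homogeneous raw sample @ X
norm = np.linalg.norm(a_corr)               # ideally close to 9.80665
```

If this reading is right, then re-running the calibration on already-corrected data would solve for a second matrix on top of the first, which is why I'm asking whether that compounds correctly or whether I should start from factory values.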

Thanks.


MartyG-RealSense commented Apr 5, 2024

Hi @TakShimoda I recommend repeating the IMU calibration script with the camera placed on the box in order to attempt to obtain a better calibration. If the results are not an improvement on the calibration that you already have then you have the option to reject the new calibration and try again.

There are no recommended guidelines available for what 'acceptable' values of x, y and z should be.

The PDF guide for the IMU calibration tool provides an example output to compare your own calibration to.

https://www.intelrealsense.com/wp-content/uploads/2020/07/IMU_Calibration_Tool_for_Intel_RealSense-Depth_Cameras_Whitepaper.pdf

[image: example calibration output from the white paper]


TakShimoda commented Apr 5, 2024

Hi Marty,

I tried multiple calibrations again. For reference, these were the values from my first test without using a box or fixture.
I loaded the previous output with the command `python rs-imu-calibration.py -i accel_raw.txt gyro_raw.txt`:

```
waiting for realsense device...
  Device PID:  0B3A
  Device name:  Intel RealSense D435I
  Serial number:  047422071445
  Product Line:  D400
  Firmware version:  5.13.0.50
[-7.71344009e-04  2.45784192e-03  4.35787575e-05]
read 6000 rows.
[1000 1000 1000 1000 1000 1000]
using 6000 measurements.
[[ 1.01792234  0.00207567  0.02371653]
 [-0.00829282  1.0248006  -0.010505  ]
 [-0.00369977  0.03251291  1.01257168]
 [ 0.05936564  0.21121225  0.29797422]]
residuals: [ 26.53659733  29.00339935 298.70507054]
rank: 4
singular: [438.32362052 428.36067776 425.69224233  77.28904438]
norm (raw data  ): 9.630758
norm (fixed data): 9.803578 A good calibration will be near 9.806650
```

Here are the outputs after the new calibration with the box:

```
[-1.87038398e-04  2.95455423e-03  1.41366231e-05]
[1000 1000 1000 1000 1000 1000]
using 6000 measurements.
[[ 1.01716039  0.01885219  0.00621466]
 [-0.00136272  1.02397348 -0.00862725]
 [-0.01313181 -0.01931514  1.01474822]
 [ 0.0370663   0.3576599   0.23345651]]
residuals: [  7.33668257 138.25565875  41.94985437]
rank: 4
singular: [437.68635144 432.61957344 422.56733585  77.23123207]
norm (raw data  ): 9.634845
norm (fixed data): 9.804643 A good calibration will be near 9.806650
```

I tried laying the camera down in position 1 (mounting screws down, device facing out), and here are 10 rows of raw linear acceleration readings:

```
x                  y                  z
-0.076091360555781 -9.77017835133428  -0.025125663649241
-0.076079948300011 -9.77025425919883  -0.025157538455801
-0.055187266754967 -9.90922054885094  -0.083511486716448
-0.063715491694036 -9.90082470710151  -0.083645890000136
-0.074146358643548 -9.8895307220054   -0.082781669165824
-0.061852362474143 -9.88953727121966  -0.070280316872341
-0.061141140396757 -9.89522713587496  -0.087679490376504
-0.072940751446932 -9.90705435016138  -0.137370685495802
-0.074546353141386 -9.87559505375838  -0.132699049012919
-0.067788394059013 -9.85195409532626  -0.123884726693389
```

These are improved, but I realized that when the camera is actually mounted on the robot (reference photo below), the results wouldn't be as accurate:
[photo: the camera mounted on the TurtleBot3 robot]

So I did the calibration again with the camera on the robot, although some positions (positions 2 and 4) were a little hard to hold still, and I got the following results:

```
[-2.88254501e-04  3.03838771e-03 -1.58022316e-05]
[1000 1000 1000 1000 1000 1000]
using 6000 measurements.
[[ 1.01448344  0.04714685 -0.02996349]
 [-0.03242957  1.02394826 -0.01973001]
 [-0.00200325  0.02102894  1.01334501]
 [ 0.04250243  0.3071155   0.25461667]]
residuals: [17.50353337  8.82432427  2.23884694]
rank: 4
singular: [439.70918697 429.78485827 424.33179608  77.25949478]
norm (raw data  ): 9.641639
norm (fixed data): 9.806368 A good calibration will be near 9.806650
```

Here are some sample readings:

```
x                 y                 z
0.037961303034369 -9.8259577141888  -0.030415055118511
0.048923273270645 -9.82547961813792  0.00224997922213
0.036971601768754 -9.83936852780892 -0.034366841041242
0.035932398187796 -9.84083681698017 -0.048926757522147
0.047933129768372 -9.82746292882352 -0.037126820140545
0.029116257012225 -9.8477014511476  -0.02376112529938
0.017204014757877 -9.8604814421949  -0.013886199361683
0.041257014319674 -9.83424779560197 -0.015103322424057
0.049873693938806 -9.83292541298501  0.000787314587065
0.050225277378593 -9.84498395110076  0.024011666833792
```

These are definitely the best results I have so far. If I wanted further improvement, would it be possible to add a bias to the raw data being output? I ask because the offsets seem to revolve around a constant value, and since I'm using a ground robot, the roll/pitch orientations never change, so I thought it could be reasonable to try something like this. Would I have to change the source code of the node to do so?
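What I have in mind is a simple constant-offset subtraction applied to each incoming sensor_msgs/Imu message, e.g. in a small relay node that subscribes to /camera/imu and republishes corrected messages. A sketch of just the correction step (the bias numbers are examples estimated from my stationary readings above, not measured constants):

```python
# Constant accelerometer bias measured at rest (x, y, z), m/s^2.
# Example values only -- they would be re-estimated per camera.
ACCEL_BIAS = (0.037, -0.033, -0.020)

def correct_accel(ax, ay, az, bias=ACCEL_BIAS):
    """Subtract a constant accelerometer bias. Only valid while the
    camera's roll/pitch never change (a ground robot), since the bias
    is expressed in the sensor frame with gravity along -y."""
    bx, by, bz = bias
    return ax - bx, ay - by, az - bz

# Applied to one of the stationary samples above:
x, y, z = correct_accel(0.037961, -9.825958, -0.030415)
```

The caveat in the docstring is the reason I mention the ground robot: if the camera could tilt, a fixed sensor-frame offset would no longer separate cleanly from the gravity component.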

I also had a couple questions for testing:

  • The wiki has SLAM tools for ROS1, but are there any available for ROS2? I wanted to test the camera on as many algorithms as possible, to see whether the erroneous trajectories can be attributed to the linear acceleration values or whether they come from somewhere else.
  • I also wanted to know if there were tools similar to rs-motion or the realsense-viewer where you can visualize odometry based on raw IMU values. I think I saw an issue where a user did that, but I can't find it. It would be a useful tool to quickly visualize whether the motion from the IMU is reasonable.
  • EDIT: Sorry, forgot to ask this question, but regarding unite_imu_method, is this IMU preintegration? I know copy and linear_interpolation are the available methods, but where can I find more details on them? Also, could I simply take the raw accelerometer and gyroscope data and put them through a separate integration script?
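For the last question, my rough mental model of linear_interpolation (my own reconstruction, not the wrapper's actual code) is that the lower-rate accel samples are interpolated at each gyro timestamp, so every unified Imu message carries both readings at the gyro's time:

```python
import numpy as np

# Toy timestamps: gyro at 400 Hz (2.5 ms), accel at 250 Hz (4 ms).
gyro_t  = np.array([0.0000, 0.0025, 0.0050, 0.0075, 0.0100])
accel_t = np.array([0.0000, 0.0040, 0.0080, 0.0120])
accel_y = np.array([-9.80, -9.82, -9.79, -9.81])  # one accel axis

# Linearly interpolate the accel axis at the gyro timestamps, so the
# unified stream runs at the gyro rate (400 Hz), matching what I observe.
accel_at_gyro = np.interp(gyro_t, accel_t, accel_y)
```

The copy method, as I understand it, would instead attach the most recent accel sample unchanged to each gyro message, which would also explain a 400 Hz output rate rather than 650 Hz.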

Thanks

@MartyG-RealSense

I looked carefully through the source code of librealsense and the ROS wrapper but did not find code that would clearly achieve the bias result that you want if it was changed, unfortunately.

SLAM is typically more difficult to implement on ROS2. There is a recent discussion of various approaches to ROS2 SLAM at IntelRealSense/realsense-ros#3046

In regard to IMU data visualization, imu_tools for ROS might suit your requirements. You can find versions of it for ROS1 and ROS2 by selecting the ROS version that you want from the branch drop-down.

https://github.com/CCNYRoboticsLab/imu_tools

The workings of unite_imu_method's linear_interpolation and copy settings are discussed in detail at IntelRealSense/realsense-ros#898

Regarding implementing a separate integration script, I do not have information about that. I do apologize.


TakShimoda commented Apr 8, 2024

Hello Marty,

Thanks, I will try out imu_tools. It seems the Madgwick filter implemented there is what I was looking for. As for the discussion of unite_imu_method, doronhi's explanation here is 5 years old, so I don't know if it's still the same implementation, since in the issue you mentioned they seem to have updated the linear interpolation scheme so messages come in periodic bursts. I also noticed the frequency of the copy method doesn't match the explanation in issue 729 from realsense-ros: he says the streams are added, but if that were the case the rate would be 400 (from gyro) + 250 (from accel) = 650 Hz, yet it remains 400 Hz (the same as linear interpolation).

Regards,

@MartyG-RealSense

There is not an available explanation for how unite_imu_method functions after the update of its implementation, unfortunately.

In mid 2022 the IMU component in IMU-equipped RealSense cameras was changed. Cameras manufactured before mid 2022 support Accel rates of 63 or 250 Hz and Gyro rates of 200 or 400 Hz. Cameras with the different IMU component, manufactured after mid 2022, support Accel rates of 100 or 200 Hz and Gyro rates of 200 or 400 Hz.


TakShimoda commented Apr 9, 2024

Thanks Marty,

All of our cameras were purchased sometime in 2021, if I recall correctly, so I think we're using the old IMU. I confirmed with rs-enumerate-devices that it supports 63/250 Hz and 200/400 Hz for accel/gyro respectively.

Your comment did alert me to the fact that I was probably checking against the wrong IMU's parameters when doing the camera-IMU calibration with kalibr. I was using the datasheet for the Bosch BMI055 to check the random walk and noise density parameters from the Allan variance analysis. I couldn't really find which IMU the old cameras use, so I didn't know what to check against when calibrating.
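For reference, the noise analysis I fed into kalibr comes from the Allan deviation of a long stationary log. A sketch of the standard overlapping estimator I used (my own implementation, not kalibr's code), demonstrated on synthetic white noise, where the deviation should fall as roughly 1/sqrt(tau):

```python
import numpy as np

def allan_deviation(omega, fs, m_list):
    """Overlapping Allan deviation of a rate signal (e.g. gyro, rad/s).
    omega: 1-D array of samples, fs: sample rate in Hz,
    m_list: cluster sizes in samples."""
    theta = np.cumsum(omega) / fs  # integrate rate -> angle
    n = len(theta)
    taus, adevs = [], []
    for m in m_list:
        if 2 * m >= n:
            break
        tau = m / fs
        # Overlapping estimator: theta[k+2m] - 2*theta[k+m] + theta[k]
        d = theta[2 * m:] - 2 * theta[m:-m] + theta[:-2 * m]
        avar = np.sum(d ** 2) / (2 * tau ** 2 * (n - 2 * m))
        taus.append(tau)
        adevs.append(np.sqrt(avar))
    return np.array(taus), np.array(adevs)

# Demo on synthetic white noise (sigma = 0.01 at 400 Hz):
rng = np.random.default_rng(0)
taus, adevs = allan_deviation(rng.normal(0.0, 0.01, 50000),
                              fs=400.0, m_list=[1, 4, 16, 64])
```

On a real gyro log, the slope of log(adev) vs log(tau) identifies the noise density (-1/2 slope region) and bias instability (flat region) that the datasheet values are compared against.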

@MartyG-RealSense

Bosch BMI055 is the IMU used by the old cameras. The post-2022 IMU is Bosch BMI085. So the datasheet that you have been using should be okay for your cameras.

@TakShimoda

Thank you for your help Marty.
