d435i IMU calibration: fixing accelerometer inaccuracies and calibrating twice. #12829
Hi @TakShimoda I recommend repeating the IMU calibration script with the camera placed on the box in order to attempt to obtain a better calibration. If the results are not an improvement on the calibration that you already have, then you have the option to reject the new calibration and try again. There are no recommended guidelines for what 'acceptable' values for x, y and z should be. The PDF guide for the IMU calibration tool provides an example output to compare your own calibration against.
I looked carefully through the source code of librealsense and the ROS wrapper but did not find code that would clearly achieve the bias result that you want if it was changed, unfortunately. SLAM is typically more difficult to implement on ROS2; there is a recent discussion of various approaches to ROS2 SLAM at IntelRealSense/realsense-ros#3046

In regard to IMU data visualization, imu_tools for ROS might suit your requirements. You can find versions of it for ROS1 and ROS2 by selecting the ROS version that you want from the branch drop-down: https://github.com/CCNYRoboticsLab/imu_tools

The workings of unite_imu_method's linear_interpolation and copy settings are discussed in detail at IntelRealSense/realsense-ros#898 Regarding implementing a separate integration script, I do not have information about that. I do apologize.
Hello Marty, Thanks, I will try out imu_tools. It seems the Madgwick filter implemented there is what I was looking for.

For the discussion on unite_imu_method: doronhi's explanation there is 5 years old, so I don't know if it's still the same implementation, since in the issue you mentioned they seem to have updated the linear interpolation scheme so that samples come in periodic bursts. I also noticed the frequency for the copy method doesn't match the explanation in issue 729 from realsense-ros, since he says the rates are added; if that were the case, the output would be 400 (from gyro) + 250 (from accel) = 650 Hz, but it remains 400 Hz (the same as linear_interpolation).

Regards,
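For anyone comparing the two unite_imu_method modes, the linear_interpolation behavior described above can be sketched in principle as follows. This is an illustrative sketch only, not the actual realsense-ros implementation; the function names and sample layout are assumptions.

```python
# Sketch: gyro arrives at 400 Hz, accel at 250 Hz. With linear_interpolation,
# accel readings are interpolated onto each gyro timestamp, so the combined
# stream keeps the gyro rate (400 Hz) rather than 650 Hz.
import bisect

def interpolate_accel(t, accel_samples):
    """Linearly interpolate (timestamp, value) accel pairs at gyro time t."""
    times = [ta for ta, _ in accel_samples]
    i = bisect.bisect_left(times, t)
    if i == 0:
        return accel_samples[0][1]       # before first accel sample: hold
    if i == len(times):
        return accel_samples[-1][1]      # after last accel sample: hold
    (t0, v0), (t1, v1) = accel_samples[i - 1], accel_samples[i]
    w = (t - t0) / (t1 - t0)
    return v0 + w * (v1 - v0)

def unite(gyro_samples, accel_samples):
    """Emit one combined IMU message per gyro sample (gyro-rate output)."""
    return [(t, g, interpolate_accel(t, accel_samples))
            for t, g in gyro_samples]

gyro = [(0.0, 0.1), (0.0025, 0.2), (0.005, 0.3)]   # 400 Hz timestamps
accel = [(0.0, 9.8), (0.004, 9.9)]                 # 250 Hz timestamps
combined = unite(gyro, accel)                      # 3 messages, gyro rate
```

Note the output message count matches the gyro stream, which is consistent with the observed 400 Hz combined rate in both modes.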
There is no available explanation for how unite_imu_method functions after the update of its implementation, unfortunately. In mid 2022 the IMU component in IMU-equipped RealSense cameras was changed. Cameras manufactured before mid 2022 support 63 or 250 Hz for the accel and 200 or 400 Hz for the gyro. Cameras with the different IMU component manufactured after mid 2022 support 100 or 200 Hz for the accel and 200 or 400 Hz for the gyro.
Thanks Marty, I did recall that all of our cameras were purchased sometime in 2021, so I think we should be using the old IMU. Your comment did alert me to the fact that I was probably using the wrong IMU datasheet to check parameters when doing the camera-IMU calibration with kalibr. I was using the datasheet from the Bosch BMI055 to check against random walk and noise density parameters when performing Allan variance analysis. I couldn't find which IMU the old cameras use, so I don't know what to check against when doing calibration.
Bosch BMI055 is the IMU used by the old cameras. The post-2022 IMU is Bosch BMI085. So the datasheet that you have been using should be okay for your cameras. |
Thank you for your help Marty. |
Issue Description
Hello, I have numerous d435i cameras mounted on turtlebot3 robots to run VIO algorithms, such as ORB-SLAM3, VINS-Fusion, OKVIS, Kimera-VIO, etc.
The issue is that the IMU data may be causing problems for the algorithms. For example, ORB-SLAM3 works well with just VO but crashes with VIO, and OKVIS and VINS-Fusion see significant drift (e.g. a circular motion looks like an arc with a very large radius). This is even after doing the camera-IMU calibration with kalibr (including the calculation of Allan variance deviations) for each of the robots (I've tested 2 so far with similar results).
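For reference, the Allan deviation from which kalibr-style noise parameters (noise density, bias random walk) are read can be computed with a minimal sketch like the one below. This is an illustrative overlapping-Allan-variance implementation under my own assumptions, not the allan_variance_ros tool or kalibr's code.

```python
# Sketch: overlapping Allan deviation of a stationary rate signal (e.g. one
# gyro axis). On a log-log plot, the -1/2 slope region gives the noise
# density and the +1/2 slope region the bias random walk.
import numpy as np

def allan_deviation(samples, fs, taus):
    """Allan deviation of `samples` (a rate signal sampled at fs Hz)
    evaluated at each averaging time in `taus` (seconds)."""
    x = np.cumsum(samples) / fs              # integrate rate -> angle
    out = []
    for tau in taus:
        m = int(tau * fs)                    # samples per cluster
        if m < 1 or 2 * m >= len(x):
            out.append(np.nan)               # tau out of usable range
            continue
        d = x[2 * m:] - 2 * x[m:-m] + x[:-2 * m]
        avar = np.sum(d ** 2) / (2 * tau ** 2 * (len(x) - 2 * m))
        out.append(np.sqrt(avar))
    return np.array(out)

# Synthetic check: pure white noise should fall as 1/sqrt(tau).
rng = np.random.default_rng(0)
gyro = rng.normal(0.0, 0.01, 400 * 600)      # 10 min of fake 400 Hz gyro
adev = allan_deviation(gyro, 400.0, [0.01, 0.1, 1.0])
```

For white noise of standard deviation sigma at sample period dt, the expected Allan deviation is sigma*sqrt(dt/tau), which is how the noise density parameter is recovered from the plot.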
I decided to check the raw IMU data from one of the cameras while it was stationary, as follows:
```
ros2 launch realsense2_camera rs_launch.py enable_gyro:=true enable_accel:=true enable_infra1:=true enable_infra2:=true gyro_fps:=400.0 accel_fps:=250.0 unite_imu_method:='linear_interpolation'
ros2 topic echo /camera/imu sensor_msgs/msg/Imu --csv --qos-history keep_all --qos-reliability reliable > imu.csv
rs-motion
```
The data seemed reasonable. From here, I noticed that my IMU calibration could be improved, because I had simply used my hands to try to keep the camera still at the 6 calibration positions, but saw that the box should be used to keep all positions perfectly perpendicular.
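A stationary recording like the one above can be sanity-checked with a short script: the mean gyro should be near zero and the mean accel magnitude near 9.81 m/s². The column indices below are assumptions about the flattened CSV layout that `ros2 topic echo --csv` produces for sensor_msgs/msg/Imu; verify them against your own file before trusting the numbers.

```python
# Sketch: quick bias check on a stationary IMU CSV recording.
import csv
import math

# Assumed zero-based column indices for angular_velocity and
# linear_acceleration in the flattened Imu message -- hypothetical,
# adjust to match your imu.csv.
GX, GY, GZ = 15, 16, 17
AX, AY, AZ = 27, 28, 29

def stationary_stats(path):
    """Return (mean gyro xyz, mean accel xyz, mean accel magnitude)."""
    gyro, accel = [], []
    with open(path) as f:
        for row in csv.reader(f):
            gyro.append(tuple(float(row[i]) for i in (GX, GY, GZ)))
            accel.append(tuple(float(row[i]) for i in (AX, AY, AZ)))
    n = len(gyro)
    mean = lambda seq, i: sum(s[i] for s in seq) / n
    g_mean = [mean(gyro, i) for i in range(3)]
    a_mean = [mean(accel, i) for i in range(3)]
    a_norm = math.sqrt(sum(v * v for v in a_mean))
    return g_mean, a_mean, a_norm

# Usage: g, a, a_norm = stationary_stats("imu.csv")
# Expect g ~ [0, 0, 0] rad/s and a_norm ~ 9.81 m/s^2 when stationary.
```

Large offsets here (e.g. an accel magnitude far from 9.81) would point at the IMU calibration itself rather than the VIO configuration.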
Thanks.