T265 pose tracking drift for slow movements #3970

Closed
rghl3 opened this issue May 13, 2019 · 23 comments
Labels
T260 series Intel® T265 library

Comments

@rghl3

rghl3 commented May 13, 2019

Required Info
Camera Model: T265
Firmware Version: 0.0.18.5502
Operating System & Version: Linux (Ubuntu 16.04)
Kernel Version (Linux Only): 4.8.0
Platform:
SDK Version: legacy / 2.21.0
Language:
Segment: Robot

Issue Description

Hi,
I have connected the camera to a laptop. When I move the camera slowly along the forward direction, the pose tracking doesn't follow the movement (it stays almost stationary). When the camera is moved a distance of around 6 meters, the 3D tracking shows it as around 0.5 meters. Here is the link for the video; please have a look at it (grid size is 1 m).

As I am using it on a rover platform moving at slow speed, these untracked movements result in accumulated drift over long runs.

@SlavikLiman

Hi @rghl3 ,
I noted in your video that the tracking confidence stays at Medium (yellow trace in the 3D view). To gain high-quality tracking, you should give the T265 enough motion to help it determine its local position accurately.
Moreover, wheel odometry input is highly recommended in your case.
Note that you can display the T265's info (including translation data) by pressing the "i" button in the 2D/3D view.
[Screenshot: the 2D/3D view with the info overlay enabled]
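For reference, the confidence can also be read programmatically from the pose stream. Below is a minimal sketch using the librealsense2 C++ API (tracker_confidence values 0–3 correspond to Failed/Low/Medium/High):

```cpp
// Minimal sketch: stream T265 pose data and print the tracker
// confidence (0 = Failed, 1 = Low, 2 = Medium, 3 = High).
#include <librealsense2/rs.hpp>
#include <iostream>

int main() try {
    rs2::pipeline pipe;
    rs2::config cfg;
    cfg.enable_stream(RS2_STREAM_POSE, RS2_FORMAT_6DOF);
    pipe.start(cfg);

    while (true) {
        rs2::frameset frames = pipe.wait_for_frames();
        if (auto pf = frames.first_or_default(RS2_STREAM_POSE)
                          .as<rs2::pose_frame>()) {
            rs2_pose pose = pf.get_pose_data();
            std::cout << "confidence: " << pose.tracker_confidence
                      << "  translation: " << pose.translation.x << ", "
                      << pose.translation.y << ", "
                      << pose.translation.z << "\n";
        }
    }
} catch (const rs2::error& e) {
    std::cerr << e.what() << std::endl;
    return 1;
}
```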

@rghl3

rghl3 commented May 14, 2019

Hi @SlavikLiman ,
Thank you so much for your response.

I noted in your video that the tracking confidence stays at Medium

To improve the confidence level, are there any suggestions, such as improving the features in the scene, or any tuning methods/parameters? On what parameters does the confidence level depend?

To gain high-quality tracking, you should give the T265 enough motion to help it determine its local position accurately.

By "enough motion", do you mean faster movements( thereby, exciting the imu) or noticeable changes in image features?

Moreover, wheel odometry input is highly recommended in your case.

For our application, we prefer to use a stand-alone camera. But if that is the only option, then we might consider adding a wheel encoder.

@SlavikLiman

SlavikLiman commented May 14, 2019

@rghl3 ,

To improve the confidence level, are there any suggestions, such as improving the features in the scene, or any tuning methods/parameters? On what parameters does the confidence level depend?

The confidence level depends on various parameters, such as scene lighting, the number of features, etc. It seems from the video that your environment is rich enough in features. It's advised that the T265 reach High confidence in order to obtain high-quality tracking.

By "enough motion", do you mean faster movements( thereby, exciting the imu) or noticeable changes in image features?

This can be achieved by moving the T265 (probably at a slightly higher speed than your rover moves). After High confidence is achieved, you should be able to move at your original speed.

For our application, we prefer to use a stand-alone camera. But if that is the only option, then we might consider adding a wheel encoder.

As I previously stated, we highly recommend providing wheel odometry in order to gain the best quality of tracking.

@ev-mp ev-mp added the T260 series Intel® T265 library label May 14, 2019
@rghl3

rghl3 commented May 15, 2019

Hi @SlavikLiman ,
Thanks for the valuable suggestions.

As I previously stated, we highly recommend providing wheel odometry in order to gain the best quality of tracking

What is the ideal frequency for wheel encoder input to the T265?

Also, from the ROS wrapper for librealsense, it seems only the linear z velocity is used.
I presume this is to reduce errors due to vibrations along the z axis, so how does it aid in tracking rover movements (as in my case)?
Please correct me if I am wrong.

@SlavikLiman

Hi @rghl3 ,
As @schmidtp1 mentioned here: "So far, the encoder precision hasn't been a limiting factor for us (since relative measurements are fused) as long as it can capture the motion of the robot approximately."

@rghl3

rghl3 commented May 16, 2019

Hi @SlavikLiman ,
Thank you for guiding me through the wheel encoder requirements.
I will provide wheel odometry data to the T265 for better tracking. May I know at what rate (Hz) the wheel odometry data should be provided?

@SlavikLiman

Hi @rghl3 ,
We used 20, 50 and 100 Hz for wheel odometry in our experiments and it worked well.
You can provide a full 3D translational velocity vector to the fusion (rotational velocity from the odometry is usually less accurate than the gyroscope and is therefore not used).
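For reference, a minimal sketch of how this looks through the librealsense2 C++ API; the calibration file name and velocity values below are placeholders:

```cpp
// Hedged sketch: feed translational wheel-odometry velocity to the T265.
#include <librealsense2/rs.hpp>
#include <fstream>
#include <iterator>
#include <vector>

int main() {
    rs2::context ctx;
    rs2::device dev = ctx.query_devices().front();

    // The T265 exposes a wheel-odometry interface on one of its sensors.
    rs2::wheel_odometer odom = dev.first<rs2::wheel_odometer>();

    // Load the odometry calibration (encoder extrinsics w.r.t. the T265)
    // before streaming starts. The file name is a placeholder.
    std::ifstream calib("calibration_odometry.json", std::ios::binary);
    std::vector<uint8_t> buf((std::istreambuf_iterator<char>(calib)),
                             std::istreambuf_iterator<char>());
    odom.load_wheel_odometery_config(buf);  // note: the SDK spells it "odometery"

    rs2::pipeline pipe;
    pipe.start();

    // Call this at your encoder rate (e.g. 100 Hz): sensor id 0, frame
    // number 0, translational velocity in m/s in the frame declared in
    // the calibration file. The velocity here is a placeholder.
    odom.send_wheel_odometry(0, 0, rs2_vector{0.0f, 0.0f, -0.3f});
    return 0;
}
```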

@rghl3

rghl3 commented May 17, 2019

Hi @SlavikLiman ,
Thank you again for the response.
I provided wheel odometry data at 100 Hz from the ROS wrapper of a motor driver. I have set up the calibration file as follows:
calibration_odometry.txt

Since I am using the ROS convention, I have considered the wheel odometry and the pose input to be in the standard convention, and as a result in the axis-angle representation.
The output odometry has linear x data, which is transformed in the librealsense ROS wrapper to -z, as in here (see the sketch below for the axis mapping I assume).
With the above-mentioned setup, the movements were still untracked, and there was also a ±0.5 m drift in the output pose from the SLAM.
Am I missing something in the setup? If needed, I will record a rosbag with all required messages to reproduce the issue.
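An illustrative helper for that axis mapping (my own sketch, not wrapper code): ROS uses x forward / y left / z up, while the T265 body frame is x right / y up / z backward, so a forward ROS velocity maps to a negative z velocity for the camera.

```cpp
// Illustrative sketch: convert a ROS body-frame linear velocity
// (x forward, y left, z up) into the T265 frame (x right, y up,
// z backward) expected by send_wheel_odometry().
#include <librealsense2/rs.hpp>

rs2_vector ros_linear_to_t265(float vx, float vy, float vz) {
    rs2_vector v;
    v.x = -vy;  // ROS left    -> T265 -x (x points right)
    v.y =  vz;  // ROS up      -> T265 +y
    v.z = -vx;  // ROS forward -> T265 -z (z points backward)
    return v;
}
```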

@SlavikLiman

Hi @rghl3 ,

  1. What do you mean by "the movements were still untracked"?
  2. How long did you drive your rover before you accumulated the ±0.5 m drift?
  3. Was the confidence level High before and throughout the drive?

@rghl3

rghl3 commented May 21, 2019

Hi @SlavikLiman ,

Here is the link for the video

  1. As in the video above, for a movement of 7 m, the tracking showed the movement to be 1 m.
  2. I drove a distance of 7 m over 10 seconds, at the end of which I had a -0.5 m drift.
  3. The confidence level was High throughout the drive.

Do you need any log files to reproduce the results on your side? Also, I observed undershoot (i.e. -0.5 m drift) more often than overshoot (+0.5 m drift).

@SlavikLiman

SlavikLiman commented May 21, 2019

Hi @rghl3 ,
Regarding #3 - as I stated above:

I noted in your video that the tracking confidence stays at Medium (yellow trace in the 3D view). To gain high-quality tracking, you should give the T265 enough motion to help it determine its local position accurately.

Please make sure that you have High confidence (green trace) and repeat your experiment.

@rghl3

rghl3 commented May 22, 2019

Hi @SlavikLiman ,
After initial excitation for about a second, the confidence level changed from Medium to High and stayed High for the whole run. But even then, there was a drift of approximately -0.5 m over 7 m when run multiple times. Is there any possibility this is a hardware issue? If so, should I send any log files or stream data for your reference?

@SlavikLiman

Hi @rghl3 ,
Thanks for the clarification that the mentioned issue isn't related to the one depicted in your video.
Let me summarize the current status as I understand it:

  1. You are able to drive 7 m, while the T265 reports that you drove approx. 6.5 m.
  2. During the whole 7 m drive, the confidence is High.

Please confirm the above.
Have you tried manually moving the T265 for those 7 m, without the rover?

@rghl3

rghl3 commented May 24, 2019

Hi @SlavikLiman ,
Yes, I confirm the above two points.
I also tried with and without the rover. On average, 2 out of 5 times the pose estimation is nearly accurate; in the remaining runs, the above issue persists.

@RealSenseCustomerSupport


Hi @rghl3

One more question before we point you to a return/exchange to see if it is a HW issue:
I just want to make sure that you're taking into account the mounting location of the T265 on the rover and the location of its center of tracking with regard to how you're measuring the distance traveled by the rover, and ultimately by the T265 device. The rover may travel 7 m, but depending on where the T265 is mounted on the rover, the T265 itself may travel a bit less.

Thanks

@rghl3

rghl3 commented May 28, 2019

Hi @RealSenseCustomerSupport ,
We measured the T265 pose in the RealSense Viewer application, without the rover, using the info tab as mentioned here.

@rghl3

rghl3 commented May 28, 2019

Hi @SlavikLiman and @RealSenseCustomerSupport,
Considering the limitations of the visual-inertial algorithm, is this the expected output of the T265 SLAM? What is the maximum accuracy that can be achieved by the T265?

@SlavikLiman

@rghl3 ,
The T265 SLAM is designed to provide an accurate position when confidence is High along its path.
Regarding the maximum accuracy question, you probably mean the expected minimum accuracy of the device. According to https://www.intelrealsense.com/visual-inertial-tracking-case-study/ : "Under 1% drift observed in repeated testing in multiple use cases and environments. AR/VR use cases were tested with the T265 mounted on the head in indoor living and office areas with typical indoor lighting including sunlight entering the room. Wheeled robot use cases tested with wheel odometer data integrated, again in indoor office and home environments."

@rghl3

rghl3 commented Jun 3, 2019

Hi @SlavikLiman,
Sorry for the late reply.

indoor living and office areas with typical indoor lighting including sunlight entering the room

We tested the camera in the conditions described above. We could obtain the claimed accuracy (1% drift) in about 60% of the runs. However, the accuracy dropped to around 5% for the remaining runs, so the results are inconsistent.
The testing conditions (lighting and features) were the same across runs.

Under 1% drift observed in repeated testing in multiple use cases and environments

What are the suggested requirements (such as lighting and features) for achieving the above accuracy consistently?

@SlavikLiman

Hi @rghl3 ,
It seems like your floor is very reflective and might confuse the tracking algorithm. Could you repeat the experiment on a less reflective surface, such as carpet? A general question: have you tried repeating your experiment in a different environment (e.g., a different room)?

@rghl3

rghl3 commented Jul 23, 2019

Hi @SlavikLiman,
The performance varies depending on the environment. On a less reflective surface like carpet, the performance is better, but over a large area the performance is still limited.

@rghl3

rghl3 commented Feb 26, 2020

Hi @SlavikLiman,

1. As in the video above, for a movement of 7 m, the tracking showed the movement to be 1 m

In continuation of the above, I get the following errors for the untracked movement:

(tm-device.cpp:1289) SLAM_ERROR Speed
and
(types.cpp:49) Out of frame resources!

I am running the T265 and a D415 simultaneously. The SLAM_ERROR Speed occurs when I rotate the camera, followed by the out-of-frame-resources error and then untracked movements. Any ideas on what might be going wrong?
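For reference, a minimal sketch of the kind of dual-pipeline setup in question, with each device selected by serial number (the serial strings are placeholders):

```cpp
// Hedged sketch: run a T265 and a D415 on separate pipelines, each
// selected by serial number. The serial strings are placeholders.
#include <librealsense2/rs.hpp>

int main() {
    rs2::config t265_cfg;
    t265_cfg.enable_device("0000943222110000");  // placeholder serial
    t265_cfg.enable_stream(RS2_STREAM_POSE, RS2_FORMAT_6DOF);

    rs2::config d415_cfg;
    d415_cfg.enable_device("001622070000");      // placeholder serial
    d415_cfg.enable_stream(RS2_STREAM_DEPTH);

    rs2::pipeline t265_pipe, d415_pipe;
    t265_pipe.start(t265_cfg);
    d415_pipe.start(d415_cfg);

    while (true) {
        // Poll each pipeline independently so a stall on one device
        // doesn't block the other.
        rs2::frameset pose_frames, depth_frames;
        if (t265_pipe.poll_for_frames(&pose_frames)) { /* use pose */ }
        if (d415_pipe.poll_for_frames(&depth_frames)) { /* use depth */ }
    }
}
```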

@RealSenseSupport

Thank you for highlighting the drift issues occurring with the T265 tracking system. We have moved our focus to our next generation of products and consequently, we will not be addressing this issue in the T265.
