
Multicam Timestamps #11

Closed
sam598 opened this issue Jan 25, 2016 · 3 comments

sam598 commented Jan 25, 2016

My understanding is that the per-frame timestamps measure only the time elapsed since the individual RealSense camera started.

When using multiple cameras simultaneously, is there a way to retrieve a relative timestamp between cameras, in order to find the time difference between exposures?

This would be extremely useful in VR and videography based applications, where multiple cameras are viewing a moving subject.

Thanks!

@sgorsten (Contributor)

Your understanding is correct. Each RealSense device has an internal hardware clock that can be used to understand the time difference between frames arriving from separate streams (depth, color, etc.) on the same device, and we abstract over this to provide frame timestamps that are coherent for that device.

Unfortunately, we do not have a clear mechanism to synchronize the timing between two independent RealSense devices viewing the same scene. Ideally you would want both devices consuming the same hardware clock signal, or you could calibrate after the fact if the mutually visible scene contained some known source of timing information, such as a blinking light or physical clock, and we can't make either assumption at the moment.
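To illustrate what "coherent timestamps for that device" buys you in practice: within a single device you can pair frames from separate streams (e.g. depth and color) by nearest timestamp. The sketch below is not librealsense API code; it is a minimal, hypothetical example using plain timestamp lists, with an assumed matching tolerance of roughly one 30 fps frame interval.

```python
def pair_frames(depth_ts, color_ts, tolerance_ms=16.0):
    """Greedily pair each depth timestamp with the nearest color
    timestamp, keeping only pairs that fall within the tolerance.

    Both lists are assumed to come from the SAME device, so their
    timestamps share one hardware clock and are directly comparable.
    """
    pairs = []
    for dt in depth_ts:
        ct = min(color_ts, key=lambda c: abs(c - dt))
        if abs(ct - dt) <= tolerance_ms:
            pairs.append((dt, ct))
    return pairs

# Hypothetical per-frame timestamps (ms) from one device's two streams.
depth = [0.0, 33.3, 66.6, 99.9]
color = [1.0, 34.0, 68.0, 101.0]
print(pair_frames(depth, color))
# [(0.0, 1.0), (33.3, 34.0), (66.6, 68.0), (99.9, 101.0)]
```

The same comparison is exactly what breaks down across two devices: each camera's timestamps start from its own clock, so the raw differences are meaningless without some external synchronization.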


sam598 commented Jan 26, 2016

Thanks for the quick response!

That's what I figured was the case.

Given that limitation, it seems the easiest runtime solution is to create a "sync point" by turning off all of the IR emitters (on the R200; I don't know if the F200 can do it as easily) and then flashing one or more of the individual emitters. Indoors, the difference in illumination should be enough to quickly register the sync frames.

This is something individual devs will have to implement, but I just wanted to check if there was some alternative built into the library.

Thanks again!
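The flash-sync idea above reduces to simple arithmetic once each camera has recorded the timestamps (in its own clock) at which it observed the shared flash. This is a hedged sketch of that bookkeeping, not part of librealsense; the function names and values are hypothetical, and it assumes the flash frames have already been detected in each stream.

```python
def estimate_offset(flash_ts_a, flash_ts_b):
    """Estimate the clock offset mapping camera B's clock onto
    camera A's clock, given the timestamps at which each camera
    observed the same emitter flashes (hypothetical inputs).

    Averaging over several flashes reduces jitter introduced by
    frame-time quantization (a flash is only localized to within
    one frame interval on each camera).
    """
    diffs = [a - b for a, b in zip(flash_ts_a, flash_ts_b)]
    return sum(diffs) / len(diffs)

def to_camera_a_clock(ts_b, offset):
    """Re-express a camera-B timestamp on camera A's clock."""
    return ts_b + offset

# Hypothetical flash observations (ms, each camera's own clock).
flash_a = [1000.0, 2000.0, 3000.0]
flash_b = [400.0, 1400.0, 2400.0]
offset = estimate_offset(flash_a, flash_b)
print(offset)                              # 600.0
print(to_camera_a_clock(1500.0, offset))   # 2100.0
```

Note this only estimates a constant offset; if the two devices' clocks drift relative to one another over a long capture, the sync flash would need to be repeated periodically and the offset refit.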

@sgorsten (Contributor)

Using the emitter to sync several R200s is a pretty reasonable idea.

It is worth noting that using multiple cameras pointed at the same scene will typically only be viable for the R200 anyway. It uses a form of stereo matching to produce depth, and the projector is simply meant to add visible texture to the scene. Multiple R200s aimed at the same surfaces will produce even more texture and improve the quality of the resulting depth.

The F200, on the other hand, projects specific patterns and relies on their integrity to generate accurate depth. If you have multiple F200s pointed at the same scene, they will interfere with one another on surfaces they both illuminate.

Referenced commits:
- dorodnic added a commit that referenced this issue, Aug 21, 2017
- ev-mp added a commit that referenced this issue, May 19, 2019
- nhershko pushed a commit to nhershko/librealsense that referenced this issue, Feb 23, 2020 (…amic_resolution: support dynamic resolution)
- YoshuaNava pushed a commit to YoshuaNava/librealsense that referenced this issue, Jun 30, 2020 (…ch-1: add libglu1-mesa-dev as a dependency)