Multicam Timestamps #11
My understanding is that the per-frame timestamps measure only the time elapsed since that individual RealSense camera started streaming.

When using multiple cameras simultaneously, is there a way to retrieve a relative timestamp between cameras, in order to find the time difference between exposures? This would be extremely useful in VR- and videography-based applications, where multiple cameras are viewing a moving subject.

Thanks!
Your understanding is correct. Each RealSense device has an internal hardware clock that can be used to understand the time difference between frames arriving from separate streams (depth, color, etc.) on the same device, and we abstract over this to provide frame timestamps that are coherent for that device. Unfortunately, we do not have a clear mechanism to synchronize the timing between two independent RealSense devices viewing the same scene. Ideally you would want both devices consuming the same hardware clock signal, or you could calibrate after the fact if the mutually visible scene contained some known source of timing information, such as a blinking light or a physical clock. We can't make either assumption at the moment.
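As a minimal sketch of what per-device coherence means in practice, the example below differences depth and color timestamps from a single device. It is written against the librealsense2 API from later releases of the library, not the API available when this issue was filed, so treat it purely as an illustration: the subtraction below is meaningful because both timestamps come from the same device's clock, while the same subtraction across two separate devices would not be.

```cpp
// Sketch: per-frame timestamps are coherent within one device, so
// differencing depth and color timestamps from the same device is
// meaningful. (librealsense2 API; illustrative only.)
#include <librealsense2/rs.hpp>
#include <iostream>

int main() try
{
    rs2::config cfg;
    cfg.enable_stream(RS2_STREAM_DEPTH);
    cfg.enable_stream(RS2_STREAM_COLOR);

    rs2::pipeline pipe;
    pipe.start(cfg);

    for (int i = 0; i < 30; ++i)
    {
        rs2::frameset frames = pipe.wait_for_frames();
        double depth_ts = frames.get_depth_frame().get_timestamp(); // ms
        double color_ts = frames.get_color_frame().get_timestamp(); // ms
        // Valid comparison: both timestamps come from this device's clock.
        std::cout << "depth - color: " << depth_ts - color_ts << " ms\n";
    }
    return 0;
}
catch (const rs2::error& e)
{
    std::cerr << "RealSense error: " << e.what() << std::endl;
    return 1;
}
```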
Thanks for the quick response! That's what I figured was the case. Given that limitation, it seems the easiest runtime solution is to create a "sync point" by turning off all of the IR emitters (on the R200; I don't know if the F200 can do it as easily) and then flashing one or more of the individual emitters. The difference in illumination indoors should be enough to quickly register the sync frames. This is something individual devs will have to implement, but I just wanted to check whether there was some alternative built into the library. Thanks again!
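To make the idea concrete, here is a rough sketch of that registration step in plain C++. The types and helpers (`FrameSample`, `find_flash`, `clock_offset`) are hypothetical, not part of the library: each camera records its device-local timestamp and the mean IR image intensity per frame, the first large intensity jump is taken as the flash, and the offset between the two clocks is the difference of the flash timestamps.

```cpp
// Sketch of the "sync point" idea. FrameSample, find_flash, and
// clock_offset are hypothetical helpers, not library functions.
#include <cstddef>
#include <iostream>
#include <optional>
#include <vector>

struct FrameSample {
    double timestamp_ms;    // device-local frame timestamp
    double mean_intensity;  // mean pixel value of the IR image
};

// Timestamp of the first frame whose mean intensity exceeds the previous
// frame's by at least `jump` -- i.e. the flash arriving in view.
std::optional<double> find_flash(const std::vector<FrameSample>& samples,
                                 double jump)
{
    for (std::size_t i = 1; i < samples.size(); ++i)
        if (samples[i].mean_intensity - samples[i - 1].mean_intensity >= jump)
            return samples[i].timestamp_ms;
    return std::nullopt;
}

// Offset to add to camera B's timestamps to express them on camera A's
// clock, assuming both cameras observed the same flash.
std::optional<double> clock_offset(const std::vector<FrameSample>& cam_a,
                                   const std::vector<FrameSample>& cam_b,
                                   double jump)
{
    auto a = find_flash(cam_a, jump);
    auto b = find_flash(cam_b, jump);
    if (a && b) return *a - *b;
    return std::nullopt;
}

int main()
{
    // Synthetic data: camera B sees the flash one frame later than A.
    std::vector<FrameSample> cam_a = {{0, 10}, {33, 11}, {66, 90}, {100, 91}};
    std::vector<FrameSample> cam_b = {{0, 12}, {33, 13}, {66, 12}, {100, 95}};
    if (auto off = clock_offset(cam_a, cam_b, 50.0))
        std::cout << "offset (B -> A): " << *off << " ms\n";
}
```

Note that a single flash only pins the offset down to one frame interval; flashing several times and averaging the per-flash offsets would tighten the estimate.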
Using the emitter to sync several R200s is a pretty reasonable idea. It is worth noting that using multiple cameras pointed at the same scene will typically only be viable for the R200 anyway. It uses a form of stereo matching to produce depth, and the projector is simply meant to add visible texture to the scene. Multiple R200s aimed at the same surfaces will produce even more texture and improve the quality of the resulting depth. The F200, on the other hand, projects specific patterns and relies on their integrity to generate accurate depth. If you have multiple F200s pointed at the same scene, they will interfere with one another on surfaces they both illuminate.