RuntimeError: Frames didn't arrive within 10000 #13272
Comments
Hi @haidaumitom. As your research has likely shown, RealSense cameras tend to have problems when used with Raspberry Pi boards. The best that can be hoped for is to stream depth and color alone, with no alignment, pointclouds or post-processing. Often, the situation that you have experienced will occur: depth streams normally but RGB color frames are not received. Usually there is no solution when the color frames are not being delivered, other than using the RGB stream from the left infrared sensor instead of the RGB sensor. The D435 model does not support this 'color from left infrared sensor' feature though, as it is only available on the D405, D415 and D455-type models (D455, D456, D457, etc.). However, a RealSense user at IntelRealSense/realsense-ros#3011 (comment), who was also using a Pi with Python and a ROS2 node, achieved a successful solution for streaming depth and RGB color and shared it.
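For readers following this advice, a minimal depth-only pyrealsense2 sketch along the lines described above (no alignment, pointcloud or post-processing) might look like the following. The resolution and frame rate are assumptions, not values taken from this thread:

```python
# Minimal sketch (not code from this thread) of the depth-only approach
# described above: a single stream, no alignment, no pointcloud,
# no post-processing. The 640x480 @ 15 fps profile is an assumption
# chosen to keep the load light on a Raspberry Pi.
import pyrealsense2 as rs
import numpy as np

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 15)

pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()
    depth_frame = frames.get_depth_frame()
    if depth_frame:
        depth_image = np.asanyarray(depth_frame.get_data())
        print("Depth frame received:", depth_image.shape)
finally:
    pipeline.stop()
```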
Hi @MartyG-RealSense, thanks for your reply. As you said, the depth image works normally but RGB color frames may not be received: I tried streaming only depth frames and they displayed really well, but streaming RGB color frames alone did not succeed. I also tried Henryburon's method you mentioned above, but it still did not work. I think I'll move to another camera model instead of the D435. Thank you for your help! I really appreciate it.
Hi @MiTomMi You are very welcome. Thanks very much for the update. I'm sincerely sorry that you did not get the outcome that you were aiming for.
Case closed due to no further comments received. |
Issue Description
I installed librealsense with the LibUVC-backend installation method. The goal of my project is to create a ROS2 Publisher Node on a Raspberry Pi 4 (Ubuntu Server 22.04) and a ROS2 Subscriber Node on my laptop (Ubuntu 22.04 desktop). The captured data will be published to the Subscriber Node and displayed in Rviz2 for further work. The Raspberry Pi 4's only job is to publish all the data to the laptop.
Here is my publisher node file on the Pi 4:
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from cv_bridge import CvBridge
import pyrealsense2 as rs
import numpy as np
import copy

class RealSensePublisher(Node):
    def __init__(self):
        super().__init__('realsense_publisher')
        # (the rest of the node, which waits for RealSense frames and
        #  publishes them, is not shown in this snippet)

def main(args=None):
    rclpy.init(args=args)
    node = RealSensePublisher()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()
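For context, here is a hedged sketch (not the author's actual file, which is truncated above) of how the frame-grabbing part of such a publisher node is commonly structured; the topic name, stream profile and timer period are assumptions. The try/except shape matches the error and warning messages quoted below:

```python
# Hedged sketch of the omitted part of a RealSense publisher node.
# Topic name, resolution, frame rate and timer period are assumptions.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from cv_bridge import CvBridge
import pyrealsense2 as rs
import numpy as np

class RealSensePublisher(Node):
    def __init__(self):
        super().__init__('realsense_publisher')
        self.publisher_ = self.create_publisher(Image, 'camera/color/image_raw', 10)
        self.bridge = CvBridge()
        self.pipeline = rs.pipeline()
        config = rs.config()
        config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 15)
        self.pipeline.start(config)
        self.timer = self.create_timer(1.0 / 15, self.timer_callback)

    def timer_callback(self):
        try:
            # Raises RuntimeError if no frameset arrives within the timeout.
            frames = self.pipeline.wait_for_frames()
        except RuntimeError as e:
            self.get_logger().error(f'RuntimeError encountered: {e}')
            self.get_logger().warn('Skipping this frame due to error')
            return
        color_frame = frames.get_color_frame()
        if color_frame:
            image = np.asanyarray(color_frame.get_data())
            self.publisher_.publish(self.bridge.cv2_to_imgmsg(image, encoding='bgr8'))
```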
When I run the file, I get the RuntimeError shown below:
[ERROR] [1724138568.671391456] [realsense_publisher]: RuntimeError encountered: Frame didn't arrive within 10000
[WARN] [1724138568.675346266] [realsense_publisher]: Skipping this frame due to error
The camera and depth sensor appear to be working fine, as I can see red flashes from the camera. In rviz2, the image topics are recognized but "No image" is shown.
I even ran ros2 node list, and the node is created and recognized. I have read many issues (including #6628) stating that the error RuntimeError: Frames didn't arrive within 10000 is related to the cable and that unplugging and replugging it would help, but in my case none of these solutions worked. If anyone who was stuck in a case like mine has found a solution, I would really appreciate the help. Thanks for reading!
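One workaround that is sometimes tried for this timeout (not confirmed anywhere in this thread to fix the Pi 4 + D435 case) is to pass a longer timeout to wait_for_frames, or to use the non-throwing try_wait_for_frames so the node can skip missed frames instead of raising:

```python
# Hedged workaround sketch: give slow USB links more time before the
# RuntimeError is raised. This does not fix the underlying Pi/USB issue.
import pyrealsense2 as rs

pipeline = rs.pipeline()
pipeline.start()
try:
    # Option 1: a longer timeout than the 10000 ms seen in the error message.
    frames = pipeline.wait_for_frames(timeout_ms=30000)

    # Option 2: non-throwing call that returns a success flag plus the frameset.
    ok, frames = pipeline.try_wait_for_frames(timeout_ms=30000)
    if not ok:
        print("No frameset arrived within the timeout")
finally:
    pipeline.stop()
```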