Hardware in the loop
A demo of a hardware-in-the-loop (HIL) autonomous driving simulation system is presented on this wiki page.
I chose an RDK X5 suite from D-Robotics as the ECU under test in this demo. The D-Robotics Developer Kit, abbreviated as D-Robotics RDK Suite, is a robot development kit built on the D-Robotics intelligent chip. Combined with the TogetheROS.Bot robot middleware, the D-Robotics RDK Suite helps developers quickly build robot prototypes and carry out evaluation and verification work. The RDK X5 provides various functional interfaces, including Ethernet, USB, camera, LCD, HDMI, CAN FD, and 40PIN, enabling users to develop and test applications such as image multimedia and deep learning algorithms.
Here is the link to the D-Robotics RDK doc page: RDK doc
The other key part of the HIL system is the image-stream injection device, which simulates the camera. I found a converter board on TaoBao (RER-H2U-V2 TaoBao link) that works as an HDMI-to-USB converter. The RER-H2U-V2 takes the image stream from HDMI as input and outputs it over USB, so it appears to the RDK suite as a USB camera.
The HIL frame picture shows the system setup. Besides the RER-H2U-V2 image converter, the RDK suite is connected to the GaussianRPG workstation over Ethernet, so that information such as control commands can be transferred via ROS 2 topics.
The time delay introduced by the RER-H2U-V2 image converter was measured as follows: first, synchronize the system clocks of the RDK and the workstation; then display a clock window on the workstation screen; finally, record the timestamp at which the RDK captures the image of that clock.
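The arithmetic behind this measurement can be sketched as follows (the function name and values are illustrative, not taken from the repo):

```python
def glass_to_glass_delay_ms(displayed_clock_s: float, capture_stamp_s: float) -> float:
    """Latency of the HDMI->USB injection path.

    displayed_clock_s: the time shown in the clock window on the workstation
    capture_stamp_s:   the synchronized system time at which the RDK saved
                       the frame containing that clock reading
    """
    return (capture_stamp_s - displayed_clock_s) * 1000.0

# Example: the clock window showed 12:00:00.000 and the RDK captured the
# frame at 12:00:00.205 -> roughly 205 ms of converter + capture latency.
print(glass_to_glass_delay_ms(0.000, 0.205))
```

This is only meaningful after the two system clocks are synchronized (e.g. via NTP/chrony), since the delay is computed across two machines.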
One of the test result pictures is shown below.
The test shows that the time delay is around 200 ms.
Another test shows that the time delay of ROS 2 topics between the two platforms is only 1-2 ms.
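A cross-platform topic-latency test of this kind typically stamps each message at the publisher and subtracts that stamp from the receive time on the other machine. The stamp arithmetic can be sketched with plain Python (the ROS 2 `builtin_interfaces/Time` message carries integer `sec` and `nanosec` fields; the function names here are illustrative):

```python
def stamp_to_seconds(sec: int, nanosec: int) -> float:
    """Convert a ROS 2 (sec, nanosec) stamp to floating-point seconds."""
    return sec + nanosec * 1e-9

def topic_latency_ms(pub_sec: int, pub_nanosec: int,
                     recv_sec: int, recv_nanosec: int) -> float:
    """Latency between the publish stamp and the receive time, in ms.

    Only valid when the two hosts' clocks are synchronized, as in this demo.
    """
    return (stamp_to_seconds(recv_sec, recv_nanosec)
            - stamp_to_seconds(pub_sec, pub_nanosec)) * 1000.0

# Example: published at t=100s exactly, received 1.5 ms later.
print(topic_latency_ms(100, 0, 100, 1_500_000))
```

In an actual rclpy node the publish stamp would come from `msg.header.stamp` and the receive time from the node clock.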
The RDK X5 suite ships with ROS 2 Humble. Copy the code under the /on_board folder onto the RDK X5 (for example, into /home/sunrise/ros2_ws/nodes/) and run colcon build there.
Run the script on the workstation:
./scripts/simulator_launch_hil.sh ../output/waymo_full_exp/waymo_train_002_1cam/trajectory/ours_50000/cams_tape.json ../data/waymo/training/002/track/track_info.txt 98 6 waymo_train_002_1cam.yaml 10.0 2.0 25.0
The picture shows the object detection result on the RDK X5.
Object detection with yolov5s on the RDK X5 takes about 100 ms, so the total time delay in this HIL demo exceeds 300 ms per step, including the image-converter delay. The parameters of the dummy AEB controller therefore have to be tuned to brake more aggressively.
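One way to make the controller "more aggressive" in the presence of latency is to budget for the distance travelled during the delay before braking takes effect. The following sketch is not the repo's actual AEB controller; all names and default values are illustrative assumptions:

```python
def aeb_should_brake(distance_m: float, speed_mps: float,
                     latency_s: float = 0.3,      # assumed ~300 ms HIL loop delay
                     decel_mps2: float = 6.0,     # assumed available deceleration
                     margin_m: float = 2.0) -> bool:
    """Decide whether a dummy AEB controller should brake now.

    The trigger distance is the sum of:
      - the distance covered while the stale perception result propagates
        (speed * latency), and
      - the kinematic braking distance v^2 / (2a), plus a safety margin.
    """
    delay_dist = speed_mps * latency_s
    brake_dist = speed_mps ** 2 / (2.0 * decel_mps2)
    return distance_m < delay_dist + brake_dist + margin_m

# Example at 10 m/s: trigger distance = 3.0 + 8.33 + 2.0 = 13.33 m,
# so an obstacle 12 m ahead triggers braking while one 20 m ahead does not.
print(aeb_should_brake(12.0, 10.0))
print(aeb_should_brake(20.0, 10.0))
```

Raising `latency_s` from, say, 0.1 to 0.3 directly enlarges the trigger distance, which is the sense in which the HIL delay forces earlier braking.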