The aim of this project is to provide the Robot-Assisted Inspection and Repair team of the German Aerospace Center (DLR) Institute of Maintenance, Repair and Overhaul (MO), as well as the robotic-inspection community, with a set of hardware and software (ROS2) tools that address the challenges of manual and automatic inspection performed by the integrated system: UR10e robotic arm + Eeloscope2 (successor of Eeloscope1), which features:
- 3 Intel RealSense D435 RGB-D cameras
- 6 LEDs illuminating the cameras' field of view
Since little open-source documentation exists on these topics in robotic inspection, this project was proposed to fill that research gap and to encourage modular and continuous research. Its potential contribution to the audience is enabling technology transfer to other systems.
The use case in this study was the inspection of an aircraft (Boeing 737-700) wing fuel tank. A research article with the methodology, analysis, approach, and results of the experiments is intended to be published soon in an MDPI open-access journal.
Developed packages of this repo:
- futama2_description -> URDF, IP and controller setup, robot state publisher, STL meshes, ...
- futama2_moveit_config -> configuration for MoveIt and the planning scene
- futama2_teleop -> demo of manual and automatic teleoperation, photo capturing, higher-level functions for robotic inspection
- futama2_utils -> utilities (so far only one, for the MoveIt planner)
A high-level overview can be graphically summarized with the following picture:
- Preparation and Recommendations
- Installation of Packages
- Docker
- Robot Usage
- Maintainers, Contributing & License
- Install the ROS2 Jazzy distro for Ubuntu 24.04 LTS (Noble)
- Follow the ROS System Setup basics (if you have access to the DLR wiki)
- Generally, on each repo, the main maintainers explain the installation steps for their package (e.g. one usually needs to manage dependencies and possibly perform additional steps).
The installation steps mentioned here are documented based on the maintainer's user experience; some are to be included in the dependencies of the futama2 packages for easier installation (such as MoveIt2).
Packages that manage the connection with the cameras. The [D405](https://www.intelrealsense.com/depth-camera-d405/) camera model was tested (the D435i is possible without using the IMU).
Required binaries:
- Instructions for installing the librealsense SDK package (from "Step 2 - Option 1 - Linux Debian Installation"); building from source is not recommended, since one can encounter multiple issues.
- Other possibly required binary packages, installed with `apt install ros-jazzy-«package_name»`: xacro, diagnostic_updater, launch_pytest, ...
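As a concrete example (package list assumed from the bullet above; add any others your build reports as missing), the binaries can be installed in one go:

```shell
# Install the ROS2 Jazzy binary dependencies mentioned above.
# Note that apt package names use hyphens instead of underscores.
sudo apt install ros-jazzy-xacro ros-jazzy-diagnostic-updater ros-jazzy-launch-pytest
```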
Building from source:
- The ros2-master branch of the wrapper (from "Step 3 - Option 2 - Install").
- Since rosdep cannot resolve the installed librealsense binaries, it is recommended to also build the master branch of the librealsense package from source.
Troubleshooting:
- Issues #1225 and #10988 can be avoided by disabling Secure Boot (only if you encounter this error; otherwise, ignore this bullet point).
- Other problems when building might be related to USB (kernel) issues or to the binary installation (Debian packages). One can usually solve them by searching the log output on the web.
- Be aware of the manual plugging/unplugging of the cameras when following the installation steps; mistakes here sometimes require starting over.
- After proper installation, one should be able to open the RealSense Viewer with `realsense-viewer`, which serves as an installation check.
- All documentation about calibration can be found in: Overview, Tuning for Best Performance (execute the Depth Quality Tool with `rs-depth-quality`), Self-Calibration (the On-Chip Calibration only considers extrinsic parameters), and the Dynamic Calibration Tool (open with `/usr/bin/Intel.Realsense.DynamicCalibrator`).
Troubleshooting:
- If a "GLFW Driver Error" (#8661) arises when opening the realsense-viewer, restarting the computer can fix it.
- After building the workspace, one should now be able to test the main launch file with:
```shell
ros2 launch realsense2_camera rs_launch.py pointcloud.enable:=true align_depth.enable:=true
```
and visualize the topics in rviz2. The four most useful topics are:
- /camera/camera/color/camera_info
- /camera/camera/color/image_rect_raw
- /camera/camera/depth/image_rect_raw
- /camera/camera/depth/color/points
For more information please consult the official RealSense usage section.
Packages that enable robotic manipulation featuring motion planning, 3D perception, kinematics, etc. Official installation steps here.
Required binaries:
- osqp_vendor, ament-cmake-google-benchmark, stomp, ros_testing
Build from source:
- MoveIt2. If using a separate git directory, the generated packages need to be linked from the git directory to the workspace: moveit2, moveit_msgs, moveit_resources, moveit_visual_tools, generate_parameter_library.
- Unresolved dependencies -> urdfdom and urdfdom_headers are installed automatically but not recognized by MoveIt; they are not required for the purposes of the futama2 project.
ROS2 manipulator drivers for the lightweight UR robotic manipulators.
Required binaries: only the "ur_msgs" package. The rest of the binaries for Jazzy seem to be in the "passing" (2023) phase, so it is better to build them from source.
Build from source:
- ros2 branch from moveit_visual_tools
- main branch from Universal_Robots_ROS2_Driver
- rolling branch from Universal_Robots_ROS2_Description
- master branch from Universal_Robots_Client_Library
Please follow the how-to-setup-the-connection instructions between external computer and the UR10e robot to link and test the hardware.
OctoMap is an efficient probabilistic 3D mapping framework based on octrees. This stack helps with mapping/voxelizing the environment, which supports the obstacle-avoidance functionality when performing motion planning with MoveIt2.
Required binaries:
- octomap, octomap-ros, octomap-server, octomap-mapping, octomap-msgs, octomap-rviz-plugins
Package required to control the 6-DoF spacemouse.
Required binaries: spacenav
- Quick test: visualize the topics by running the node with `ros2 run spacenav spacenav_node`
TODO: will be done in the coming weeks. The installation steps were already followed and they work, but a Docker image and a .repos file can be added to enable an easier installation.
The setup shown in this diagram is needed for the robot to work properly (for mock and real environments). If the spacemouse is not available, the robot can still be operated with the keyboard, but photo capture becomes trickier. It is best to have the complete setup.
The robot is configured by default to use the joint_trajectory_controller (Mode T in the keyboard_node terminal). Move the visual marker of the MoveIt Motion Planning plugin in RViz and press the "Plan & Execute" button before changing to Cartesian mode (C in the second terminal). Mode changes are also possible using the buttons of the simple spacemouse (not the newer versions at the moment).
Set the initial positions in futama2_description/config/initial_positions.yaml. Currently there are two, which were found suitable for wing inspection and automatic OIP object inspection.
```shell
# terminal 1
ros2 launch futama2_teleop teleop_demo.launch.py mode:=mock insp_mode:=manual camera_mdl:=d405 spacemouse_mdl:=pro multicam:=false octomap:=false
# terminal 2
ros2 run futama2_teleop keyboard_node
```
- Change to `spacemouse_mdl:=simple` if the spacemouse with only two buttons is available, or `spacemouse_mdl:=false` if it is not available.
- Change to `mode:=real` when using real hardware.
- Change to `camera_mdl:=d435i` when debugging e.g. at home with only the D435i camera available.
- Change to `multicam:=true` if the three D405 cameras are used.
- Change to `octomap:=true` if voxelization of newly discovered objects is required for obstacle avoidance.
- Change to `insp_mode:=auto_minimal` if you want to run a short demo of moving to a pose near the wing.
- Change to `insp_mode:=auto_oip` if you want to run a short demo of automatic inspection based on the OIP inspection planner.
- If you want to launch the motion APIs standalone, set `insp_mode:=manual` and:
```shell
ros2 launch futama2_teleop minimal_motion_planner_api.launch.py
ros2 launch futama2_teleop auto_insp_oip.launch.py
```
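As an illustration, several of these arguments can be combined in one invocation. The particular combination below (real hardware, three D405 cameras, simple spacemouse, octomap enabled) is an assumed example built from the argument list above, not a tested reference configuration:

```shell
# Hypothetical combined run: real robot, manual inspection, all three
# D405 cameras, two-button spacemouse, octomap-based obstacle avoidance.
ros2 launch futama2_teleop teleop_demo.launch.py \
  mode:=real insp_mode:=manual camera_mdl:=d405 \
  spacemouse_mdl:=simple multicam:=true octomap:=true
```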
When using the real robot, enable auto exposure:
```shell
ros2 param set /camera/camera depth_module.enable_auto_exposure true
```
TODO: check what to show from the pictures, maybe make a collage.
Additional terminal for the photo capturing ("screenshots") function, triggered by pressing both side buttons of the spacemouse. If the spacemouse is not available, this function has to be simulated by publishing the following to the /spacenav/joy topic:
```shell
# press (both buttons down)
ros2 topic pub --once /spacenav/joy sensor_msgs/Joy \
  '{header: {stamp: now, frame_id: "realsense_center_link"}, axes: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0], buttons: [1, 1]}'
# release (both buttons up)
ros2 topic pub --once /spacenav/joy sensor_msgs/Joy \
  '{header: {stamp: now, frame_id: "realsense_center_link"}, axes: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0], buttons: [0, 0]}'
```
- The picture should now be saved in your workspace or current directory.
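For illustration, the trigger behavior behind the press/release pair above can be sketched as a plain Python edge detector: a capture fires once when both buttons go down together and re-arms on release. This is a hypothetical helper, not the actual futama2_teleop implementation:

```python
# Sketch of a photo-capture trigger on /spacenav/joy button states.
# Capture fires on the rising edge of "both buttons pressed", so holding
# the buttons does not produce duplicate captures.
class PhotoTrigger:
    def __init__(self):
        self._both_pressed_before = False

    def update(self, buttons):
        """Return True exactly once per simultaneous press of both buttons."""
        both_pressed = len(buttons) >= 2 and buttons[0] == 1 and buttons[1] == 1
        fired = both_pressed and not self._both_pressed_before
        self._both_pressed_before = both_pressed
        return fired

trigger = PhotoTrigger()
print(trigger.update([1, 1]))  # press both  -> capture (True)
print(trigger.update([1, 1]))  # still held  -> no second capture (False)
print(trigger.update([0, 0]))  # release     -> re-armed (False)
print(trigger.update([1, 1]))  # press again -> capture (True)
```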
In order to make a 3D reconstruction of the inspected object after the manual or automated inspection experiments, one requires a rosbag with the following topics recorded:
- «each camera prefix»/color/image_rect_raw
- «each camera prefix»/depth/color/points
- «each camera prefix»/depth/camera_info
- /tf
- /tf_static
- /monitored_planning_scene
- /robot_description
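As a small sketch, the per-camera topic list above can be expanded into a `ros2 bag record` command with a helper like the following. The function name and the camera prefixes are illustrative assumptions; adapt them to your actual camera namespaces:

```python
# Hypothetical helper: build the topic list for a reconstruction rosbag,
# given each camera's topic prefix (e.g. one prefix per D405 camera).
def reconstruction_topics(camera_prefixes):
    per_camera = ["color/image_rect_raw", "depth/color/points", "depth/camera_info"]
    topics = [f"{prefix}/{topic}" for prefix in camera_prefixes for topic in per_camera]
    # Robot-state topics recorded once, independent of the cameras.
    topics += ["/tf", "/tf_static", "/monitored_planning_scene", "/robot_description"]
    return topics

# Example with three assumed camera prefixes:
prefixes = ["/camera_left", "/camera_center", "/camera_right"]
cmd = "ros2 bag record " + " ".join(reconstruction_topics(prefixes))
print(cmd)
```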
Once the rosbag is stored, it can be used as the input for the already published Vinspect package. Example:
- Motion planner not working: if you tried to change the controller to all_controller (scaled_joint_trajectory_controller) and the robot still doesn't move, it is better to cancel all processes (even from the UR interface) and start from scratch, to avoid dangerous robot jumps between aggressive movements.
- Problem with moveit/moveit2#1049 when launching the main command (no MoveIt planner appearing; alphanumeric locale problem with Ubuntu). Solution:
```shell
export LC_NUMERIC="en_US.UTF-8"
```
If the error persists, repeat the installation steps for the MoveIt packages.
- Adrian Ricardez Ortigosa adrian.ricardezortigosa@dlr.de
- Dr. Marc Bestmann marc.bestmann@dlr.de
If you’re interested in contributing to the FuTaMa2 project, there are several ways to get involved. Development of the project takes place on the official GitHub repository. There, you can submit bug reports, feature requests, and pull requests. Even and especially when in doubt, feel free to open an issue with a question. Contributions of all types are welcome, and the development team is happy to provide guidance and support for new contributors.
Additionally, the robot-assisted-repair@dlr.de mailing list is available for discussion and support related to the project.
This work is licensed under multiple licenses:
All original source code, configuration, and documentation is licensed under MIT.
Borrowed code is licensed under Apache-2.0.
For more accurate information, check the individual files.