feat: radar object fusion #1016

Merged
43 changes: 43 additions & 0 deletions perception/radar_fusion_to_detected_object/CMakeLists.txt
@@ -0,0 +1,43 @@
cmake_minimum_required(VERSION 3.5)
project(radar_fusion_to_detected_object)

## Compile options
if(NOT CMAKE_CXX_STANDARD)
set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_STANDARD_REQUIRED ON)
set(CMAKE_CXX_EXTENSIONS OFF)
endif()

if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang")
add_compile_options(-Wall -Wextra -Wpedantic)
endif()

## Dependencies
find_package(autoware_cmake REQUIRED)
autoware_package()

## Targets
ament_auto_add_library(radar_object_fusion_to_detected_object_node_component SHARED
src/radar_object_fusion_to_detected_object_node/radar_object_fusion_to_detected_object_node.cpp
src/radar_fusion_to_detected_object.cpp
)

rclcpp_components_register_node(radar_object_fusion_to_detected_object_node_component
PLUGIN "radar_fusion_to_detected_object::RadarObjectFusionToDetectedObjectNode"
EXECUTABLE radar_object_fusion_to_detected_object_node
)

## Tests
if(BUILD_TESTING)
list(APPEND AMENT_LINT_AUTO_EXCLUDE ament_cmake_uncrustify)

find_package(ament_lint_auto REQUIRED)
ament_lint_auto_find_test_dependencies()
endif()

## Package
ament_auto_package(
INSTALL_TO_SHARE
config
launch
)
76 changes: 76 additions & 0 deletions perception/radar_fusion_to_detected_object/README.md
@@ -0,0 +1,76 @@
# radar_fusion_to_detected_object

This package contains a sensor fusion module for radar-detected objects and 3D detected objects. The fusion node can:

- Attach velocity to 3D detections when radar data is successfully matched. Tracking modules use this velocity information to improve their tracking results, while planning modules use it to execute behaviors such as adaptive cruise control.
- Improve low-confidence 3D detections when corresponding radar detections are found.

![process_low_confidence](docs/radar_fusion_to_detected_object_6.drawio.svg)

## Core algorithm

The core algorithm is documented in [docs/algorithm.md](docs/algorithm.md).

### Parameters for sensor fusion

| Name | Type | Description | Default value |
| :----------------------- | :----- | :------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | :------------ |
| bounding_box_margin      | double | The distance by which to extend the 2D bird's-eye-view bounding box on each side. This distance is used as a threshold to find radar centroids falling inside the extended box. [m] | 2.0           |
| split_threshold_velocity | double | The velocity threshold for deciding whether to split an object into two objects based on radar information (currently not implemented). [m/s]                                       | 5.0           |

### Weight parameters for velocity estimation

To tune these weight parameters, see the [algorithm document](docs/algorithm.md) for details.

| Name                                 | Type   | Description                                                                                                                                                                              | Default value |
| :----------------------------------- | :----- | :--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | :------------ |
| velocity_weight_average              | double | The weight applied to the average twist of the radar data in the velocity estimation.                                                                                                   | 0.0           |
| velocity_weight_median               | double | The weight applied to the median twist of the radar data in the velocity estimation.                                                                                                    | 0.0           |
| velocity_weight_min_distance         | double | The weight applied to the twist of the radar datum nearest to the center of the bounding box in the velocity estimation.                                                                | 1.0           |
| velocity_weight_target_value_average | double | The weight applied to the target-value-weighted average twist in the velocity estimation. The target value is amplitude for radar pointclouds and probability for radar objects.        | 0.0           |
| velocity_weight_target_value_top     | double | The weight applied to the twist of the radar datum with the highest target value in the velocity estimation. The target value is amplitude for radar pointclouds and probability for radar objects. | 0.0           |

Review thread on this table:

> **Reviewer (Member):** I think these values require more explanation. The description is ambiguous.
>
> **Contributor (Author):** fix
>
> **Reviewer (Member):** Can you please write the commit where you fixed it?

### Parameters for fixed object information

| Name | Type | Description | Default value |
| :----------------------- | :---- | :------------------------------------------------------------------------------------------------------------------------------------------------------ | :------------ |
| convert_doppler_to_twist | bool  | Convert Doppler velocity to twist using the yaw information of a detected object.                                                                        | false         |
| threshold_probability    | float | If the probability of an output object is lower than this value and the object has no associated radar points/objects, the object is deleted.            | 0.4           |

## radar_object_fusion_to_detected_object

Sensor fusion of radar objects and 3D detected objects.

- The calculation cost is O(nm); see the sketch below.
  - n: the number of radar objects.
  - m: the number of objects from 3D detection.
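
The following is a minimal sketch of this O(nm) matching loop. It uses simplified stand-in types and an illustrative distance gate rather than the `autoware_auto_perception_msgs` types and the extended bounding-box test the node actually uses (see [docs/algorithm.md](docs/algorithm.md) for the real gating rule):

```cpp
// Simplified stand-ins for the radar and detected-object messages (illustrative only).
#include <cmath>
#include <cstddef>
#include <vector>

struct RadarObject { double x, y; };
struct DetectedObject { double x, y; };

// For each of the m detected objects, scan all n radar objects -> O(n*m).
std::vector<std::vector<std::size_t>> associate(
  const std::vector<DetectedObject> & objects,      // m objects from 3D detection
  const std::vector<RadarObject> & radar_objects,   // n radar objects
  double gate_radius)                                // placeholder gate; the node uses an extended BEV box
{
  std::vector<std::vector<std::size_t>> matches(objects.size());
  for (std::size_t i = 0; i < objects.size(); ++i) {
    for (std::size_t j = 0; j < radar_objects.size(); ++j) {
      const double dist = std::hypot(
        radar_objects[j].x - objects[i].x, radar_objects[j].y - objects[i].y);
      if (dist < gate_radius) {
        matches[i].push_back(j);
      }
    }
  }
  return matches;
}
```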

### How to launch

```sh
ros2 launch radar_fusion_to_detected_object radar_object_to_detected_object.launch.xml
```

### Input

| Name | Type | Description |
| ----------------------- | ---------------------------------------------------- | ---------------------------------------------------------------------- |
| `~/input/objects`       | autoware_auto_perception_msgs/msg/DetectedObjects.msg | 3D detected objects.                                                                 |
| `~/input/radar_objects` | autoware_auto_perception_msgs/msg/TrackedObjects.msg  | Radar objects. Note that the `frame_id` needs to be the same as that of `~/input/objects`. |

### Output

| Name | Type | Description |
| ------------------ | ----------------------------------------------------- | ------------------------------ |
| `~/output/objects` | autoware_auto_perception_msgs/msg/DetectedObjects.msg | 3D detected objects with twist. |

### Parameters

| Name | Type | Description | Default value |
| :------------- | :----- | :-------------------- | :------------ |
| update_rate_hz | double | The update rate [Hz]. | 20.0          |

## radar_scan_fusion_to_detected_object (TBD)

TBD
@@ -0,0 +1,15 @@
/**:
ros__parameters:
node_params:
update_rate_hz: 10.0

core_params:
bounding_box_margin: 2.0
split_threshold_velocity: 5.0
velocity_weight_average: 0.0
velocity_weight_median: 0.0
velocity_weight_min_distance: 1.0
velocity_weight_target_value_average: 0.0
velocity_weight_target_value_top: 0.0
convert_doppler_to_twist: false
threshold_probability: 0.4
43 changes: 43 additions & 0 deletions perception/radar_fusion_to_detected_object/docs/algorithm.md
@@ -0,0 +1,43 @@
## Common Algorithm

### 1. Link between 3d bounding box and radar data

Select the radar pointcloud/objects that lie within the 3D bounding box of a lidar-based detection, extended by a margin, in bird's-eye view.

![choose_radar](radar_fusion_to_detected_object_1.drawio.svg)
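
A minimal sketch of this bird's-eye-view test, assuming a simplified 2D pose/size representation rather than the actual message types (the function and field names are illustrative):

```cpp
#include <cmath>

struct Pose2d { double x, y, yaw; };
struct Size2d { double length, width; };

// Returns true if the radar centroid (px, py) lies inside the detected object's
// BEV footprint extended by `margin` (the bounding_box_margin parameter) on each side.
bool isWithinExtendedBox(
  const Pose2d & object_pose, const Size2d & object_size,
  double px, double py, double margin)
{
  // Express the radar centroid in the object's local frame (rotate by -yaw).
  const double dx = px - object_pose.x;
  const double dy = py - object_pose.y;
  const double local_x = std::cos(object_pose.yaw) * dx + std::sin(object_pose.yaw) * dy;
  const double local_y = -std::sin(object_pose.yaw) * dx + std::cos(object_pose.yaw) * dy;

  // Compare against the half extents enlarged by the margin.
  return std::abs(local_x) <= 0.5 * object_size.length + margin &&
         std::abs(local_y) <= 0.5 * object_size.width + margin;
}
```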

### 2. [Feature support] Split the object going in a different direction

- Split a low-confidence object into two objects when the radar information suggests it actually contains two objects.

![process_low_confidence](radar_fusion_to_detected_object_4.drawio.svg)

### 3. Estimate twist of object

Estimate the twist of the object from the selected radar pointcloud/objects using their twist and target value (the target value is amplitude for radar pointclouds and probability for radar objects).
First, the estimation function calculates

- the average twist of the radar pointcloud/objects,
- the median twist of the radar pointcloud/objects,
- the twist of the radar pointcloud/object nearest to the center of the bounding box,
- the target-value-weighted average twist of the radar pointcloud/objects,
- the twist of the radar pointcloud/object with the maximum target value.

Second, the estimation function calculates the weighted average of these candidates using the velocity weight parameters.
Third, the estimated twist is attached to the object.

![estimate_doppler_velocity](radar_fusion_to_detected_object_2.drawio.svg)
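
A minimal sketch of the final combination step, assuming the five candidate twists above have already been computed; the weights correspond to the `velocity_weight_*` parameters and the names here are illustrative:

```cpp
#include <array>
#include <cstddef>

struct Twist2d { double vx, vy; };

// Weighted average of the five candidate twist estimates.
// weights = {velocity_weight_average, velocity_weight_median,
//            velocity_weight_min_distance, velocity_weight_target_value_average,
//            velocity_weight_target_value_top}
Twist2d combineTwistEstimates(
  const std::array<Twist2d, 5> & estimates, const std::array<double, 5> & weights)
{
  double weight_sum = 0.0;
  Twist2d combined{0.0, 0.0};
  for (std::size_t i = 0; i < estimates.size(); ++i) {
    combined.vx += weights[i] * estimates[i].vx;
    combined.vy += weights[i] * estimates[i].vy;
    weight_sum += weights[i];
  }
  if (weight_sum > 0.0) {
    combined.vx /= weight_sum;
    combined.vy /= weight_sum;
  }
  return combined;
}
```

With the default parameters (min_distance weight 1.0, all others 0.0), this reduces to using the twist of the radar datum nearest to the bounding-box center.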

### 4. [Feature support] [Option] Convert doppler velocity to twist

If the radar twist information is a Doppler velocity, convert it to a twist using the yaw angle of the DetectedObject.
Because a radar pointcloud carries only Doppler velocity information, radar pointcloud fusion should use this feature.
On the other hand, because radar objects already carry twist information, radar object fusion should not use this feature.

![process_high_confidence](radar_fusion_to_detected_object_3.drawio.svg)
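
A minimal sketch of the underlying idea, under the assumption that the object moves along its own yaw direction, so the Doppler (radial) velocity is the projection of its velocity onto the radar line of sight. This is an illustrative formulation, not the package's exact implementation:

```cpp
#include <cmath>
#include <optional>

// Recover the object's longitudinal speed from a Doppler measurement, assuming the
// object moves along its yaw direction. Returns nullopt when the geometry makes the
// conversion unreliable.
std::optional<double> dopplerToLongitudinalSpeed(
  double doppler_velocity,   // radial velocity measured by the radar [m/s]
  double line_of_sight_yaw,  // yaw of the radar-to-object direction [rad]
  double object_yaw)         // yaw of the detected object [rad]
{
  const double cos_angle = std::cos(object_yaw - line_of_sight_yaw);
  // Near 90 degrees the radial component carries almost no information about the
  // object's speed, so give up rather than amplify noise.
  if (std::abs(cos_angle) < 0.1) {
    return std::nullopt;
  }
  return doppler_velocity / cos_angle;
}
```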

### 5. Delete objects with low probability

- Delete low-confidence objects that are not associated with any radar points/objects.

![process_low_confidence](radar_fusion_to_detected_object_5.drawio.svg)
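
A minimal sketch of this filtering rule, assuming a simplified object type with an existence probability and a count of linked radar points/objects; the threshold corresponds to the `threshold_probability` parameter:

```cpp
#include <algorithm>
#include <vector>

struct FusedObject { double probability; int num_linked_radar; };

// Remove objects that are both low-confidence and unsupported by radar data.
void pruneLowConfidenceObjects(std::vector<FusedObject> & objects, double threshold_probability)
{
  objects.erase(
    std::remove_if(
      objects.begin(), objects.end(),
      [&](const FusedObject & o) {
        // Keep the object if it is confident enough or is supported by radar data.
        return o.probability < threshold_probability && o.num_linked_radar == 0;
      }),
    objects.end());
}
```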