The automatic calibration for the red ball demo includes two independent steps:

The `calibColor` module calibrates the colors of the input image by maintaining the white balance. In the red ball demo, the output of `calibColor` is used as input to `pf3dTracker`. An example of output is the following:

icalibColor.mp4
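The exact algorithm used by `calibColor` is not detailed here; a common way to maintain white balance is the gray-world assumption (each channel is scaled so its mean matches the overall gray level). The sketch below is purely illustrative of that idea, not the module's actual implementation:

```python
import numpy as np

def gray_world_balance(img):
    """Illustrative gray-world white balance (an assumption, not
    calibColor's documented algorithm): scale each color channel so
    its mean matches the global gray mean. `img` is HxWx3."""
    img = np.asarray(img, dtype=np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / channel_means
    # Rescale and keep values in the valid 8-bit range.
    return np.clip(img * gains, 0.0, 255.0)
```

A channel with a strong color cast (e.g. too much red) gets a gain below one, pulling the image back toward neutral.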
The devised procedure estimates the offsets required by `demoRedBall` by merging information from vision and from the Cartesian controllers: the offset is estimated as the difference between the end-effector position provided by the controller (green) and the ball center provided by the visual tracker (yellow), discounted by the ball radius.
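The offset computation described above can be sketched in a few lines. Note that the direction along which the radius is discounted is an assumption here (the text only says the ball center is "discounted by its radius"); the function and argument names are illustrative:

```python
import numpy as np

def estimate_offset(ee_pos, ball_center, ball_radius, approach_dir):
    """Illustrative sketch: offset = end-effector position (from the
    Cartesian controller) minus the tracked ball center, with the ball
    center discounted by its radius along an assumed approach direction.
    All frames, names, and the discount direction are assumptions."""
    d = np.asarray(approach_dir, dtype=float)
    d = d / np.linalg.norm(d)
    # Point on the ball surface facing the hand, rather than the center.
    surface = np.asarray(ball_center, dtype=float) - ball_radius * d
    return np.asarray(ee_pos, dtype=float) - surface
```

With the ball resting in the palm, this difference captures the systematic error between where the controller believes the hand is and where vision sees it.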
More specifically, the devised steps are the following:

- Run the demoRedBall application and its dependencies, without launching the `demoRedBall` module (which is used for reaching and grasping).
- Run the iCub skin.
- Run `calibOffsets` and connect.
- Open a terminal and type:

  ```
  yarp rpc calibOffsets/rpc lookAndCalibrate part
  ```

  where `part` is the arm you want to calibrate. This will move the desired arm/hand to a predefined position and direct the gaze towards the end-effector. Now you can push the ball into the middle of the palm of the hand to trigger the calibration (blue arrow in the image). Only once the offsets between the end-effector (green) and the tracking centroid (yellow) have been computed will the service provide an acknowledgment:
Note: the reaching and grasping offsets include a ball radius and a ball diameter, respectively, to prevent the hand from touching the ball while reaching.
- In the same terminal, type:

  ```
  writeToFile part
  ```

  This will create a file called `calibOffsetsResults.txt` in the `calibOffsets` context.
- Run the script:

  ```
  copyParamsRedBall.sh calibOffsetsResults.txt
  ```

  to write the offsets into the `demoRedBall` configuration file.
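The last step, copying the calibrated offsets into the configuration file, can be sketched as below. The file formats are hypothetical (the actual layout of `calibOffsetsResults.txt` and of the `demoRedBall` configuration file is defined by the modules themselves); the sketch only illustrates the key-replacement idea behind `copyParamsRedBall.sh`:

```python
from pathlib import Path

def copy_offsets(results_file, conf_file):
    """Illustrative sketch of what copyParamsRedBall.sh does, under the
    assumption that both files use simple 'key value ...' lines:
    replace, in conf_file, every line whose key appears in results_file."""
    offsets = {}
    for line in Path(results_file).read_text().splitlines():
        parts = line.split()
        if parts:
            offsets[parts[0]] = parts[1:]

    out_lines = []
    for line in Path(conf_file).read_text().splitlines():
        parts = line.split()
        key = parts[0] if parts else ""
        if key in offsets:
            # Overwrite this parameter with the calibrated values.
            out_lines.append(" ".join([key] + offsets[key]))
        else:
            out_lines.append(line)
    Path(conf_file).write_text("\n".join(out_lines) + "\n")
```

Lines in the configuration file whose keys do not appear in the results file are left untouched, so only the calibrated offsets change.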
Now we can run `demoRedBall` with the newly adapted configuration file and make all the required connections.
If you want to run the automatic deployment, you can refer to the instructions in the Applications section of Robot Bazaar.
The following video shows the entire pipeline working on the robot.