Home
vstarlinger edited this page Mar 18, 2019
- Clone the repo.
- Build and source the workspace, then use `rosdep install qr_detection` to install the necessary dependencies.
- If you want to test with your own webcam and the provided `camera.launch` file, make sure the following packages are installed as well:
  - `ros-melodic-video-stream-opencv`
  - `ros-melodic-image-pipeline`
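The setup steps above can be sketched as one script. The workspace path `~/catkin_ws` and the `<repo-url>` placeholder are assumptions, not taken from this page; the script defaults to printing each command, and executes them only with `DRY_RUN=0`:

```shell
# Sketch of the setup steps; ~/catkin_ws and <repo-url> are assumed placeholders.
# Defaults to a dry run that prints each command; set DRY_RUN=0 to execute.
run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi; }

run cd ~/catkin_ws/src
run git clone "<repo-url>"           # clone the repo (URL not given on this page)
run cd ~/catkin_ws
run catkin_make                      # build the workspace
run source devel/setup.bash          # source it
run rosdep install qr_detection      # install the declared dependencies
# Extra packages for webcam testing with camera.launch:
run sudo apt install ros-melodic-video-stream-opencv ros-melodic-image-pipeline
```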
- For the detection tool to work, the camera needs a calibration file. To create one, first use `roslaunch qr_detection camera.launch` to start the camera. Then, in a second terminal, use `rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.0245 image:=/camera/image_raw camera:=/camera` with this board, adjusting the square size (in meters) if necessary.
- Move the board in front of the camera (left, right, forward, back, and tilting it) until all four bars are green. Then press the calibrate button to calibrate the camera, and press commit afterwards to save the file to the default location.
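The two-terminal calibration procedure above can be sketched as a single script. It defaults to only printing the commands (set `DRY_RUN=0` to actually execute them) and assumes a sourced ROS Melodic environment:

```shell
# Dry-run sketch of the calibration commands above; set DRY_RUN=0 to execute.
run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi; }

# Terminal 1: start the camera driver.
run roslaunch qr_detection camera.launch

# Terminal 2: start the calibrator on the live image topic.
# --size is the board's inner-corner grid, --square its edge length in meters.
run rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.0245 \
  image:=/camera/image_raw camera:=/camera
```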
- Kill all the nodes using Ctrl+C.
- To test the QR detection, use `roslaunch qr_detection test_detect_individual_ar_tags.launch`. This will launch the camera as well as the necessary tf publishers to display the results accordingly. Use rviz to display the detection results by adding the tf tree, the raw camera feed, and the visualization markers provided by the `ar_track_alvar` messages.
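The detection test can be sketched the same way; rviz itself still needs its displays added by hand, as described above. Again a dry-run by default:

```shell
# Dry-run sketch of the detection test; set DRY_RUN=0 to execute.
run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi; }

# Launches the camera plus the tf publishers needed to place the results.
run roslaunch qr_detection test_detect_individual_ar_tags.launch

# In a second terminal: open rviz, then add the TF display, an Image display
# on /camera/image_raw, and a Marker display for the ar_track_alvar output.
run rosrun rviz rviz
```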
- Markers for testing can be generated using the `rosrun ar_track_alvar createMarker 0` command. This saves a .png file of the marker with the desired ID in the current directory.
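To generate a batch of test markers rather than one at a time, the `createMarker` call can be looped over a range of IDs (IDs 0–4 here are an arbitrary choice, not from this page). Dry-run by default:

```shell
# Dry-run sketch: generate markers 0-4 in the current directory.
# Set DRY_RUN=0 to actually run the ROS commands.
run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi; }

for id in 0 1 2 3 4; do
  run rosrun ar_track_alvar createMarker "$id"   # one .png marker file per ID
done
```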