Enable odometry/kinematics pose estimation in the path_follow.py template #948
Comments
An excellent roadmap for the path forward.
Related to #4, the T265 uses a coordinate system that is common in virtual reality, where the z-axis and x-axis form the driving plane. It is more common in robotics to use a coordinate system where the x-axis and y-axis form the driving plane. The T265 coordinates can be converted to this system with a simple transformation.
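A minimal sketch of that remapping, assuming the T265's default axes (+x right, +y up, +z backward) and a target robotics convention of +x forward, +y left, +z up; the function name is just illustrative and the mounting orientation of the camera still needs to be accounted for:

```python
def t265_to_robot(tx, ty, tz):
    """Remap a T265 translation into a robotics frame where the
    x/y axes span the driving plane (assumed T265 default axes)."""
    x = -tz   # robot forward is the T265's -z
    y = -tx   # robot left is the T265's -x
    z = ty    # up stays up
    return x, y, z

# e.g. a point 1 m ahead and 0.5 m to the right of the camera
print(t265_to_robot(0.5, 0.0, -1.0))  # (1.0, -0.5, 0.0)
```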
The gps_logger branch has code for kinematics and speed control in parts/tachometer.py, parts/odometer.py, parts/kinematics.py, and parts/velocity.py. The current state is that encoders and kinematics can be used to record and follow a path, since the estimate is repeatable, but it is not actually correct. If you plot the path you can see it oversteers badly and produces a 'looping' path, so neither position nor heading is actually correct. My theory is that because I've separated out the parts so much, and they run either on the vehicle loop or in separate threads, we don't get very fast and consistent updates. We really want updates closer to 100hz than 20hz for the kinematics to be accurate. If that theory is correct, then the thing to do is make a more monolithic part that handles encoders -> tachometry -> odometer -> kinematics in a single pipeline (a single part) that can run in its own thread so it goes as fast as possible. We can wrap the current parts in another part to turn them into a monolithic pipeline, then start that part in a thread.
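A rough sketch of what such a monolithic part could look like, using the usual Donkeycar threaded-part conventions (update/run_threaded/shutdown). The constructor arguments, the encoder callback, and the bicycle model here are hypothetical stand-ins, not the actual gps_logger parts:

```python
import math
import time

class PoseEstimatorPipeline:
    """Hypothetical wrapper: encoder -> odometer -> bicycle kinematics
    integrated in one fast loop instead of separate vehicle-loop parts."""
    def __init__(self, read_encoder, ticks_per_meter, wheel_base, hz=100):
        self.read_encoder = read_encoder        # callable returning cumulative ticks
        self.meters_per_tick = 1.0 / ticks_per_meter
        self.wheel_base = wheel_base            # meters, front axle to rear axle
        self.period = 1.0 / hz
        self.x = self.y = self.theta = 0.0
        self.last_ticks = None
        self.steering_angle = 0.0               # radians, written by the vehicle loop
        self.running = True

    def update(self):
        # Thread target: integrate the pose as fast and as consistently as possible.
        while self.running:
            start = time.time()
            ticks = self.read_encoder()
            if self.last_ticks is not None:
                distance = (ticks - self.last_ticks) * self.meters_per_tick
                self.theta += distance * math.tan(self.steering_angle) / self.wheel_base
                self.x += distance * math.cos(self.theta)
                self.y += distance * math.sin(self.theta)
            self.last_ticks = ticks
            time.sleep(max(0.0, self.period - (time.time() - start)))

    def run_threaded(self, steering_angle):
        # Called at the (slower) vehicle-loop rate; just exchange state.
        self.steering_angle = steering_angle
        return self.x, self.y, self.theta

    def shutdown(self):
        self.running = False
```

In a template this would be added with threaded=True and fed the commanded steering, converted from the normalized [-1, 1] value to an actual steering angle.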
See PR #1089 for latest. Status is that this branch has been successfully tested with the MOCK encoder implementation for both Bicycle and Unicycle configurations, and has been successfully tested in the real world using the Unicycle configuration. I'm just fixing the unit tests, which broke due to the refactor; then the PR can be opened.
Completed in PR #1089
path_follow.py allows the user to record a path and then have the vehicle drive that path. The drive mode will record waypoints on the path that the user manually drives (this should be a closed loop). The autopilot mode will then drive this path over and over.
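As an illustration of the recording side, a minimal waypoint recorder might look like the sketch below; the class name, spacing threshold, and CSV format are assumptions, not the template's actual path part:

```python
import csv

class PathRecorder:
    """Record (x, y) waypoints as the user drives, skipping points
    that are too close to the previously saved waypoint."""
    def __init__(self, min_spacing=0.3):
        self.min_spacing = min_spacing   # meters between saved waypoints
        self.waypoints = []

    def record(self, x, y):
        if not self.waypoints:
            self.waypoints.append((x, y))
            return
        lx, ly = self.waypoints[-1]
        if (x - lx) ** 2 + (y - ly) ** 2 >= self.min_spacing ** 2:
            self.waypoints.append((x, y))

    def save(self, filename):
        with open(filename, "w", newline="") as f:
            csv.writer(f).writerows(self.waypoints)
```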
Currently, the path_follow.py template is dedicated to using the Intel RealSense T265; there are no other supported ways to record points on the path or to follow the recorded path. There are other ways to provide pose estimates to create a path. The most straightforward is odometry coupled with forward kinematics for pose estimation. We could also use IMU data and fuse it with other sources. This would also be the template to add RTK GPS support.
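For reference, the two forward-kinematics pose updates look roughly like this; distances come from the odometer, and this is only a sketch, not the kinematics.py API:

```python
import math

def unicycle_update(x, y, theta, d_left, d_right, axle_length):
    """Differential-drive (unicycle) pose update from per-wheel distances."""
    distance = (d_left + d_right) / 2.0
    theta += (d_right - d_left) / axle_length
    x += distance * math.cos(theta)
    y += distance * math.sin(theta)
    return x, y, theta

def bicycle_update(x, y, theta, distance, steering_angle, wheel_base):
    """Car-like (bicycle) pose update from traveled distance and steering angle."""
    theta += distance * math.tan(steering_angle) / wheel_base
    x += distance * math.cos(theta)
    y += distance * math.sin(theta)
    return x, y, theta
```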
For now, this ticket will generalize the path_follow.py template to support odometry and forward kinematics (both unicycle and bicycle) for pose estimates. While I am at it, I will generalize support for drive-trains so it shares the same code as the complete.py template (it is currently limited to I2C_SERVO). This would then allow any kind of supported robot architecture to use the path_follow.py template.
This is a first step in making the path_follow.py template a more usable and powerful template. It represents a different way to navigate a track: using a map rather than a neural network. This is how the best performers at the DIYRobocars races do it. Ultimately we want to add more ways to localize the robot, a more general map format, utilities for calculating an optimal path, and more algorithms for following that path.
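One simple algorithm for the "follow" half, sketched here with illustrative names rather than the template's actual controller, is to steer in proportion to the cross-track error against the nearest recorded waypoint:

```python
import math

def cross_track_steering(path, x, y, theta, gain=1.0):
    """Return a steering command in [-1, 1] that nudges the vehicle back
    toward the closest recorded waypoint. path is a list of (x, y) points.
    Assumes positive steering turns left; flip the sign otherwise."""
    nearest = min(path, key=lambda p: (p[0] - x) ** 2 + (p[1] - y) ** 2)
    dx, dy = nearest[0] - x, nearest[1] - y
    # signed lateral error in the vehicle frame (positive = waypoint to the left)
    cte = -dx * math.sin(theta) + dy * math.cos(theta)
    return max(-1.0, min(1.0, gain * cte))
```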