Driving Miss Phoebe (Not Self-Driving... Yet)
Now that the Roboclaw ROS node is configured for Phoebe TurtleBot's physical layout, and the code is running reliably instead of crashing, I can drive it around as a remote control car. This isn't the end goal, of course. If all I wanted was a remote control car, there were far cheaper and easier ways to get one. Right now the goal is just to verify that the drive system works and is properly integrated into the ROS infrastructure.
In order to send driving commands to Phoebe, I need something that publishes to the command velocity topic /cmd_vel. On an autonomous robot, the navigation code would be in charge of sending these commands. But since it's a pretty common task to move a robot around manually, there are standard ROS packages to drive a robot via human commands. The most basic is teleop_twist_keyboard, which allows driving by pressing keys on a keyboard. Alternatively there is teleop_twist_joy for driving via a joystick.
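All of those teleop nodes boil down to the same thing: publishing geometry_msgs/Twist messages on /cmd_vel. As a rough sketch of that idea (this is not the teleop packages' actual code, and the speed values here are arbitrary examples), a minimal rospy node that drives a robot forward could look like this:

```python
#!/usr/bin/env python
# Minimal sketch of what a teleop node publishes: geometry_msgs/Twist
# messages on /cmd_vel. The speed values below are arbitrary examples.
import rospy
from geometry_msgs.msg import Twist

def drive_forward():
    rospy.init_node('simple_driver')
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    rate = rospy.Rate(10)  # teleop nodes typically republish around 10 Hz
    msg = Twist()
    msg.linear.x = 0.2   # forward at 0.2 m/s (example value)
    msg.angular.z = 0.0  # no rotation
    while not rospy.is_shutdown():
        pub.publish(msg)
        rate.sleep()

if __name__ == '__main__':
    try:
        drive_forward()
    except rospy.ROSInterruptException:
        pass
```

teleop_twist_keyboard does essentially this, except it maps keystrokes to the linear and angular values instead of hard-coding them.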
Those remote control (or 'teleoperation' in ROS parlance) nodes worked right off the bat, which is great. But I quickly got bored with driving Phoebe around on the carpet in front of me. First I launched RViz to visualize scanning LIDAR data as I did before. This was enough for me to drive Phoebe beyond my line of sight, watching its surroundings in the form of laser scan dots. After I verified that still worked, I stepped up the difficulty: I wanted RViz to plot laser data on top of odometry data in order to verify that the Roboclaw ROS node is generating odometry data correctly.
To do this, I needed to participate in the ROS coordinate transform stack, the infrastructure that tracks all the frames of reference relating robot components to each other in physical space. The Roboclaw ROS node publishes a transform relating the robot's position reference frame base_link to the odometry origin reference frame odom. The LIDAR ROS node publishes its data relative to its own neato_laser reference frame. As the robot builder, it was my job to write a transform relating the neato_laser frame to Phoebe's base_link frame. Fortunately, the ROS transform tutorials covered this exact task and I quickly got my desired RViz plot.
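The transform itself is simple to publish. The sketch below follows the style of the ROS tf tutorials; the translation offsets are placeholder values I made up, since the real numbers depend on exactly where the LIDAR sits on Phoebe:

```python
#!/usr/bin/env python
# Sketch of a tf broadcaster relating the LIDAR's neato_laser frame to
# Phoebe's base_link frame, in the style of the ROS tf tutorials.
# The translation offset below is a placeholder; the real numbers depend
# on where the LIDAR is mounted on the robot.
import rospy
import tf

if __name__ == '__main__':
    rospy.init_node('phoebe_laser_tf_broadcaster')
    br = tf.TransformBroadcaster()
    rate = rospy.Rate(20)
    while not rospy.is_shutdown():
        br.sendTransform(
            (0.0, 0.0, 0.1),  # x, y, z offset in meters (placeholder values)
            tf.transformations.quaternion_from_euler(0, 0, 0),  # assuming no rotation
            rospy.Time.now(),
            'neato_laser',    # child frame: the LIDAR
            'base_link')      # parent frame: the robot body
        rate.sleep()
```

For a fixed mounting like this, tf's static_transform_publisher can do the same job from a launch file without writing any code at all.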
It looks like the LIDAR scan plot from earlier, but now there's an arrow indicating Phoebe's position and direction. The bigger change, not visible in this screenshot, is that RViz is now plotting in the odometry frame. We're no longer watching strictly from Phoebe's viewpoint, where the robot stays at the center of the screen; instead, Phoebe should move relative to the map.
I drove Phoebe forward, and was happy to see that the laser scan stayed still relative to the map while the red arrow representing Phoebe moved forward. But when I started turning Phoebe, the red arrow turned one way and the LIDAR plot moved the opposite way.
Something is wrong in the coordinate transform data. Time for some debugging...
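A laser plot that counter-rotates against the robot marker smells like a sign flip somewhere, either in the odometry yaw or in one of the transforms. One minimal first check, assuming the Roboclaw node publishes nav_msgs/Odometry on /odom (that topic name is my assumption), is to watch the reported yaw while commanding a turn and see whether a left turn makes yaw increase, as the ROS counterclockwise-positive convention says it should:

```python
#!/usr/bin/env python
# Debugging sketch: print the robot's reported heading from odometry while
# commanding a turn via teleop, to check whether the rotation sign matches
# expectations (ROS convention: counterclockwise yaw is positive).
# Assumes odometry arrives as nav_msgs/Odometry on /odom; adjust if not.
import math
import rospy
import tf
from nav_msgs.msg import Odometry

def on_odom(msg):
    q = msg.pose.pose.orientation
    # Convert the orientation quaternion to Euler angles; yaw is about Z.
    _, _, yaw = tf.transformations.euler_from_quaternion([q.x, q.y, q.z, q.w])
    rospy.loginfo('yaw: %+.1f degrees', math.degrees(yaw))

if __name__ == '__main__':
    rospy.init_node('odom_yaw_monitor')
    rospy.Subscriber('/odom', Odometry, on_odom)
    rospy.spin()
```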