This is the fourth in a series of blog posts on the Open Source Mars Rover. Most concepts discussed here apply to other wheeled robots as well. You can read about parts 1, 2, and 3 at these links.
When I send my Mars Rover on missions to the grocery store, or to drop off another robot at a colleague’s during the quarantine, and my cellular connection isn’t great or I don’t want to micromanage it, I need the robot to drive autonomously. For that, the robot has to understand where it is in the world and where obstacles are. Over the past few years, that has become quite straightforward. We’ll walk through setting up the two most commonly used ROS packages for 2D SLAM and localization, gmapping and amcl. Use SLAM (gmapping) to create a high-fidelity map of the robot’s environment, then use localization (amcl) to continuously track the robot’s position within that map.
Don't Go Anywhere without Rock-Solid Odometry
The package we’ll be looking at is designed to work with lidar and any kind of odometry. Various robust solutions exist for when you don’t have lidar data, but since lidar is rapidly becoming a default on many mobile bases, we’ll focus on gmapping, which is robust and has a simple interface. More advanced localization packages exist, but poor-quality maps are usually the result of bad odometry or uncalibrated sensors rather than the algorithm, and gmapping works extremely well with a smaller computational footprint. It’s tempting to jump straight into SLAM, but you will save a lot of time if you dial in your odometry first. Learn how to do that in the previous episode, Deploying on Mars: Rock solid odometry for wheeled robots.
Before you set up mapping, make sure your odometry works well. On the left, a bug in odometry causes an occasional slip-up bad enough to ruin the entire map!
Create a Map with Gmapping
Under the hood, gmapping uses a Rao-Blackwellized particle filter. In simple terms, the algorithm maintains a set of hypothesized robot trajectories (called particles), propagates them at each time step using the supplied odometry, and assigns each a weight based on how well its predicted position matches the lidar scan that is actually perceived. The “Rao-Blackwellized” part means the map itself is not sampled: given a particle’s trajectory, its map can be computed analytically, which (by the Rao-Blackwell theorem) reduces the variance of the estimate compared to sampling trajectory and map jointly.
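To make the predict/weight/resample cycle concrete, here is a toy 1D particle filter localizing a robot in a corridor with a single known wall. This is a hypothetical illustration of the general technique, not gmapping’s actual code or API; the wall position, noise values, and particle count are all made up for the example.

```python
import math
import random

WALL = 10.0          # known wall position along the corridor (the "map")
SENSOR_SIGMA = 0.2   # assumed range-sensor noise (std dev, meters)

def predict(particles, odom_delta, motion_sigma=0.05):
    """Propagate each particle by the odometry delta plus motion noise."""
    return [p + odom_delta + random.gauss(0.0, motion_sigma) for p in particles]

def weight(particles, measured_range):
    """Weight each particle by how well its predicted range matches the sensor."""
    ws = []
    for p in particles:
        expected = WALL - p                 # range a robot at p would see
        err = measured_range - expected
        ws.append(math.exp(-err * err / (2 * SENSOR_SIGMA ** 2)))
    total = sum(ws) or 1.0
    return [w / total for w in ws]          # normalize to sum to 1

def resample(particles, weights):
    """Draw a new particle set proportionally to the weights."""
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(42)
particles = [random.uniform(0.0, 10.0) for _ in range(500)]  # uniform prior
true_pos = 2.0
for _ in range(20):                          # robot moves 0.1 m per step
    true_pos += 0.1
    particles = predict(particles, 0.1)
    weights = weight(particles, WALL - true_pos)  # simulated measurement
    particles = resample(particles, weights)

estimate = sum(particles) / len(particles)
print(round(estimate, 1))  # converges near the true position (~4.0)
```

Gmapping does the same thing in 2D with full laser scans, and additionally carries a map hypothesis along with each particle.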
Let’s get it set up:
- Verify that your lidar is up and running and you can see the scan data at a frequency of at least 3Hz.
- Make sure your tf2 tree is set up correctly. There should be a static link between the base_link and the lidar frame. Verify that the lidar is in the correct position relative to the base_link. A good way to check this is to rotate your robot in place and check that the lidar scans of nearby objects remain in place. Your tree should look like odom -> base_link -> lidar_frame.
<node name="tf2_static_pub_base_link_to_lidar"
      pkg="tf2_ros"
      type="static_transform_publisher"
      args="-0.01 0 -0.01 0 0 0 1 base_link lidar_frame"/>
- Install gmapping: `sudo apt install ros-melodic-slam-gmapping`, replacing melodic with your ROS distribution (kinetic or noetic)
- Create the following launch file in a new or existing package, or add it to an existing launch file:
<launch>
  <node name="slam"
        pkg="gmapping"
        type="slam_gmapping">
    <remap from="scan" to="/my_lidar/scan"/>
    <param name="srr" value="0.1"/>
  </node>
</launch>
- Now launch the file using roslaunch. If all goes well, you shouldn’t see any errors.
Change the scan topic to match your lidar’s output topic. I’ve added an example parameter (srr, the odometry error in translation as a function of translation) and set it to its default of 0.1.
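Before trusting the output, it’s worth sanity-checking each input from a terminal. The topic and frame names below match the examples above; substitute your own. (These commands require a running ROS master.)

```shell
# Check the scan rate (should be at least ~3 Hz)
rostopic hz /my_lidar/scan

# Confirm the static transform between base_link and the lidar frame
rosrun tf tf_echo base_link lidar_frame

# After launching gmapping, the map topic should be publishing
rostopic hz /map
```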
In Freedom, make sure you enable the /map topic in the bandwidth settings. Next, go to SETTINGS > PILOT > ENVIRONMENT and make sure the map topic is set to /map. Go back to the PILOT view and you should see the map being visualized! Drive the robot around and check that the map picks up new landmarks. Don’t worry if it doesn’t look perfect on the first try! For now, upload the map at 1 Hz or higher so we can watch it change, but once the map stops changing, you can lower that to every 30 seconds or upload it only once.
View of the map from the Pilot tab in the Freedom Robotics web app. The camera view and lidar insets let you create a map from anywhere in the world.
Saving the Map for Later
Gmapping doesn’t automatically save the map file for you, so you’ll need to save it while gmapping is running. To do that, we’ll use map_server (`sudo apt install ros-melodic-map-server`).
rosrun map_server map_saver -f my_map map:=/map
You should see two new files: my_map.pgm and my_map.yaml. Open the pgm file and it should show your map!
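The .yaml file tells map_server how to interpret the image. A typical file looks roughly like the sketch below; the resolution depends on gmapping’s delta parameter, and the origin will match wherever your map happened to start.

```yaml
image: my_map.pgm            # occupancy image, relative to this file
resolution: 0.05             # meters per pixel
origin: [-10.0, -10.0, 0.0]  # (x, y, yaw) of the lower-left pixel in the map frame
negate: 0                    # 0 means white = free, black = occupied
occupied_thresh: 0.65        # cells above this probability count as occupied
free_thresh: 0.196           # cells below this probability count as free
```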
Open your .pgm file in any graphics editor to fix mistakes or add no-go zones by drawing onto it. This is also useful when objects in the environment have changed.
Localizing using AMCL
Gmapping always starts from scratch, but you’ll probably want to reuse the map you already created rather than build a new one on every run. That’s where localization with amcl (adaptive Monte Carlo localization) comes in. It works similarly to SLAM, except it only localizes and doesn’t create a map.
First, let’s load the map you just created into ROS, again using map_server. Create a new launch file, amcl.launch, and add the following:
<launch>
  <arg name="path_to_map" default="./my_map.yaml"/>
  <node name="map_server"
        pkg="map_server"
        type="map_server"
        args="$(arg path_to_map)"/>
  <node name="amcl"
        pkg="amcl"
        type="amcl">
    <remap from="scan" to="/my_lidar/scan"/>
  </node>
</launch>
The interface and inner workings of amcl are very similar to those of gmapping. Now roslaunch this file, move your robot around, and check that it improves on the localization estimate provided by the odometry alone.
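Two commands I find useful while checking amcl (they require a running ROS master; the frame and topic names are amcl’s defaults):

```shell
# Watch amcl's pose estimate while driving; it should track the robot smoothly
rostopic echo /amcl_pose

# If amcl starts lost, give it a hint by publishing an initial pose
# (or use RViz's "2D Pose Estimate" tool, which publishes the same message)
rostopic pub -1 /initialpose geometry_msgs/PoseWithCovarianceStamped \
  '{header: {frame_id: "map"}, pose: {pose: {position: {x: 0.0, y: 0.0}, orientation: {w: 1.0}}}}'
```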
Beyond gmapping and amcl
I’ve found that a properly tuned particle filter works just as well as an extended or unscented Kalman filter. Pick packages that you understand, and look for good indicators of community support: open issues that get responses, questions being asked on answers.ros.org, and recent activity on the package’s repo. A couple of pointers for other sensor setups:
- Outdoors, with GPS: robot_localization
- Fusing an IMU with wheel odometry: robot_localization
For 3D mapping, I recommend RTAB-Map. It’s well supported, powerful, and works with various sensors out of the box. It’s inherently more complex and computationally intensive than any 2D mapping package, so I would only suggest it if you absolutely need 3D mapping and localization. With a lot of fine-tuning, I’ve achieved millimeter-level accuracy using depth cameras, lidar, and wheel odometry.
Instead of investing effort into ever-higher-fidelity maps and localization, you can also consider “going map-light”: rather than relying solely on global localization, program your robot to react to ad hoc sensory information. If you’re building a pallet-moving robot, use the map to navigate to the approximate location, then switch to a local controller based on features or fiducials on the pallets. Regardless, accurate localization and mapping make navigation a lot easier.
I’m always excited to hear any feedback or help answer any questions: achille@freedomrobotics.ai. I also highly encourage taking advantage of the free trial on Freedom Robotics to make the mapping process much easier.