Hi, I recently acquired a myAGV Jetson Nano, the upgraded version of myAGV by Elephant Robotics. Lately, I have been interested in SLAM (Simultaneous Localization and Mapping) and wanted to delve deeper into some algorithms and planning related to SLAM in ROS (Robot Operating System). According to the official GitBook provided, the main algorithm used for mapping and navigation is the gmapping algorithm.
This article primarily explores how to enhance the precision of myAGV by adjusting the gmapping algorithm and making other software optimizations, without relying on hardware upgrades.
Product Introduction: myAGV Jetson Nano
This is the second generation of myAGV. The first generation was available only in a Raspberry Pi 4B version, which lacked computational power. The second generation adds a version controlled by the Jetson Nano, which can satisfy most ROS workloads and most of the computational needs of embedded robots.
Compared to the previous version, this one comes with many additional accessories. It can be equipped with a 3D camera for visual mapping and has a display screen installed on the body for easier operation. Additionally, an extra energy storage battery has been added, making it more durable than before (the previous version would run out of power in about an hour). Having seen the reviews and usage of the first Raspberry Pi version, I have high expectations for this upgraded version.
It is equipped with a radar, a high-performance planetary DC brushless motor, and retains competition-grade omnidirectional wheels. On top of the original features, it has opened up a Python control interface and supports graphical programming and other software. What excites me the most is that the main control board offers powerful graphic processing capabilities and also supports 3D mapping and navigation.
NVIDIA Jetson Nano B01 is a small yet powerful embedded computing development board designed for artificial intelligence (AI) and machine learning (ML) applications.
Key Specifications:
● NVIDIA Maxwell architecture GPU with 128 CUDA cores.
● Quad-core ARM Cortex-A57 CPU with a clock speed of 1.43 GHz.
● 4GB LPDDR4 memory with a 64-bit interface and a frequency of 1600 MHz.
Based on these specifications, Jetson Nano B01 is suitable for various AI and embedded application scenarios, particularly in robotics for tasks such as autonomous navigation, motion control, and path planning.
Gmapping
Gmapping is a widely used SLAM (Simultaneous Localization and Mapping) algorithm. It uses a particle filter to build a map of the environment and estimate the robot's pose at the same time, while the robot is moving.
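To make the particle-filter idea concrete, here is a minimal, self-contained 1-D localization sketch. This is a toy example, not the actual gmapping code; all function names and noise values are illustrative.

```python
import math
import random

def particle_filter_step(particles, weights, motion, measurement,
                         motion_noise=0.1, meas_noise=0.5):
    """One predict/update/resample cycle of a toy 1-D particle filter.

    particles   : list of candidate 1-D positions
    weights     : importance weights (same length as particles)
    motion      : commanded displacement this step
    measurement : noisy observation of the true position
    """
    # Predict: move every particle, adding motion noise.
    particles = [p + motion + random.gauss(0, motion_noise) for p in particles]

    # Update: weight each particle by its measurement likelihood.
    weights = [math.exp(-0.5 * ((p - measurement) / meas_noise) ** 2)
               for p in particles]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]

    # Resample: draw particles in proportion to their weights.
    particles = random.choices(particles, weights=weights, k=len(particles))
    weights = [1.0 / len(particles)] * len(particles)
    return particles, weights

if __name__ == "__main__":
    random.seed(0)
    n = 200
    particles = [random.uniform(-5, 5) for _ in range(n)]
    weights = [1.0 / n] * n
    true_pos = 0.0
    for _ in range(20):                      # robot moves +0.2 per step
        true_pos += 0.2
        meas = true_pos + random.gauss(0, 0.5)
        particles, weights = particle_filter_step(particles, weights, 0.2, meas)
    estimate = sum(particles) / len(particles)
    print(f"estimate: {estimate:.2f}, truth: {true_pos:.2f}")
```

Gmapping applies the same predict/weight/resample loop in 2-D, with each particle additionally carrying its own map hypothesis.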
On startup, the robot boots into Ubuntu, which makes it convenient to work with ROS (Robot Operating System). As far as I know, the host comes with some basic mapping functionality pre-configured.
For first-time users of such robots, it is very user-friendly, providing a UI interface where you can simply click on what you need help with.
With just a few clicks, within minutes, you can activate the radar and start running gmapping for environment mapping.
The UI interface is very user-friendly and relatively comprehensive in terms of basic mapping and navigation functions. However, it is primarily useful for beginners who want to get started quickly and not for those looking to develop other types of projects.
Afterwards, you can start mapping. Using VNC for remote connection, you can control myAGV with a keyboard to map the environment you want to navigate.
So far, everything has been smooth. If you prefer not to use the UI, you can also manually enter command lines to execute environment mapping functions. The following commands are all pre-packaged by Elephant Robotics and need to be run in the command line.
# start the lidar
roslaunch myagv_odometry myagv_active.launch
# launch the gmapping file
roslaunch myagv_navigation myagv_slam_laser.launch
# launch teleop control
roslaunch myagv_teleop myagv_teleop.launch
# save the map
rosrun map_server map_saver
Next, we proceed with the navigation function. First, change the path of the saved map in the navigation package.
Close the mapping terminal and run the navigation command.
roslaunch myagv_navigation navigation_active.launch
At this point, it is crucial to place the myAGV at the initial position where the robot started during the mapping process, or adjust it in RViz to ensure that the position of myAGV on the map matches its actual position in the environment. This alignment is necessary to ensure accurate navigation to the desired destination.
Click on "2D Pose Estimate" in the top toolbar to make adjustments, ensuring the car in the RViz interface corresponds with the physical car. The terminal will then return the coordinates and heading angle of the car relative to the map.
Distributed navigation can also be implemented by recording the parameters of the navigation points, including the x, y coordinates and the yaw angle.
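A recorded (x, y, yaw) point has to be expressed as a position plus an orientation quaternion before it can be sent as a ROS navigation goal. Below is a small sketch of that conversion in plain Python, with no ROS dependency; the waypoint values are made up for illustration.

```python
import math

def yaw_to_quaternion(yaw):
    """Convert a planar yaw angle (radians) to the (x, y, z, w)
    quaternion used by ROS goal messages. For rotation purely about
    the z-axis, only the z and w components are non-zero."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

# Recorded navigation points: (x, y, yaw) noted down from the terminal
# after "2D Pose Estimate" (these particular values are invented).
waypoints = [(1.0, 0.5, 0.0), (2.0, 1.0, math.pi / 2)]

for x, y, yaw in waypoints:
    qx, qy, qz, qw = yaw_to_quaternion(yaw)
    print(f"goal: pos=({x}, {y}) quat=({qx}, {qy}, {qz:.3f}, {qw:.3f})")
```

In a real script, each tuple would populate a `PoseStamped` goal for the navigation stack; the half-angle formula above is the standard yaw-only quaternion.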
However, during the navigation process, some issues were observed. A key problem is the lack of precision. Out of 10 navigation attempts, each one deviated slightly from the original path.
Identified Issues and Solutions
Why Does Deviation Occur?
There are two main reasons for the deviation:
1. Sensor Errors: Inaccuracies in the hardware sensors.
2. Algorithm Limitations: Constraints inherent in the software algorithms.
Solutions
Below are the methods I used to address these issues, based on the parameters documented for the ROS gmapping package.
1. Adjusting Radar Parameters
a. maxRange and maxUrange
i. maxRange: Set the maximum detection distance of the laser radar. Ensure this value matches the actual measurement range of the laser radar.
ii. maxUrange: Set the maximum effective distance for building the map. This value is generally slightly less than maxRange and should be set to a reasonable value based on the actual measurement distance.
b. sigma
i. Represents the standard deviation of the laser radar measurements. A smaller value indicates more precise measurements. Adjust this value based on the actual performance of the laser radar to reduce the impact of measurement noise.
c. kernelSize
i. Indicates the window size for scan matching. Larger values can increase the robustness of matching but also increase computation. Adjust this parameter to balance computation time and matching accuracy.
d. lstep and astep
i. lstep: Linear step size, representing the size of the translation step during scan matching.
ii. astep: Angular step size, representing the size of the rotation step during scan matching. Reducing these steps can improve scan matching accuracy but will also increase the computational burden.
e. particles
i. Number of particles. More particles can improve the accuracy and stability of localization but will also increase computational load. Increase the number of particles as appropriate, within the limits of available computational resources.
f. xmin, ymin, xmax, ymax
i. Set the boundaries of the map to ensure these values cover the entire area where the robot operates. Adjusting the map boundaries appropriately can reduce the calculation of invalid areas and improve overall efficiency.
<launch>
<arg name="scan_topic" default="scan" />
<node pkg="gmapping" type="slam_gmapping" name="gmapping" output="screen" clear_params="true">
<param name="base_frame" value="base_footprint"/>
<param name="odom_frame" value="odom"/>
<!--param name="odom_frame" value="odom_combined"/-->
<param name="map_update_interval" value="0.1"/>
<!-- Set maxUrange < actual maximum range of the Laser -->
<param name="maxRange" value="5.0"/>
<param name="maxUrange" value="4.5"/>
<param name="sigma" value="0.05"/>
<param name="kernelSize" value="1"/>
<param name="lstep" value="0.05"/>
<param name="astep" value="0.05"/>
<param name="iterations" value="5"/>
<param name="lsigma" value="0.075"/>
<param name="ogain" value="3.0"/>
<param name="lskip" value="0"/>
<param name="srr" value="0.01"/>
<param name="srt" value="0.02"/>
<param name="str" value="0.01"/>
<param name="stt" value="0.02"/>
<param name="linearUpdate" value="0.5"/>
<param name="angularUpdate" value="0.436"/>
<param name="temporalUpdate" value="-1.0"/>
<param name="resampleThreshold" value="0.5"/>
<param name="particles" value="80"/>
<param name="xmin" value="-1.0"/>
<param name="ymin" value="-1.0"/>
<param name="xmax" value="1.0"/>
<param name="ymax" value="1.0"/>
<param name="delta" value="0.05"/>
<param name="llsamplerange" value="0.01"/>
<param name="llsamplestep" value="0.01"/>
<param name="lasamplerange" value="0.005"/>
<param name="lasamplestep" value="0.005"/>
<remap from="scan" to="$(arg scan_topic)"/>
</node>
</launch>
2. Adjusting Odometry Parameters
Odometry Model:
● Calibrate Odometry Model Parameters: Ensure that these parameters accurately reflect the robot's motion characteristics.
● Check and Adjust Wheel Radius and Axle Distance: Reduce model errors by ensuring these parameters are precise.
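To see why these two parameters matter, here is a toy dead-reckoning update for a simplified differential-drive model. myAGV actually uses mecanum wheels, so this is only an illustration of how wheel radius and track width enter the odometry; all numbers are made up.

```python
import math

def odom_update(x, y, theta, ticks_l, ticks_r,
                wheel_radius, track_width, ticks_per_rev):
    """Dead-reckoning pose update for a simplified differential drive.

    An error in wheel_radius scales every distance estimate; an error
    in track_width skews every heading change.
    """
    # Distance each wheel rolled since the last update.
    d_l = 2 * math.pi * wheel_radius * ticks_l / ticks_per_rev
    d_r = 2 * math.pi * wheel_radius * ticks_r / ticks_per_rev
    d = (d_l + d_r) / 2.0                  # forward distance
    dtheta = (d_r - d_l) / track_width     # heading change
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    return x, y, theta + dtheta

# A 5% error in the assumed wheel radius yields a 5% error in the
# distance the robot believes it has travelled.
x1, _, _ = odom_update(0, 0, 0, 1000, 1000, 0.050, 0.20, 4096)
x2, _, _ = odom_update(0, 0, 0, 1000, 1000, 0.0525, 0.20, 4096)
print(f"assumed r=0.050 m: {x1:.4f} m, assumed r=0.0525 m: {x2:.4f} m")
```

This is why calibrating the physical wheel parameters pays off: the error is systematic and accumulates over every update.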
Sensor Fusion:
● Combine IMU Data: Use methods such as the Extended Kalman Filter (EKF) to fuse sensor data and improve localization accuracy.
● Ensure Time Synchronization of Odometry and IMU Data: Reduce timing errors by synchronizing the odometry and IMU data streams.
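As a sketch of the fusion idea (not the full multi-dimensional EKF that packages such as robot_pose_ekf or robot_localization implement), here is a scalar Kalman filter that predicts the heading from odometry increments and corrects it with an absolute IMU heading. All variances and the drift scenario are illustrative.

```python
def kf_heading_step(theta, var, dtheta_odom, var_odom, theta_imu, var_imu):
    """One predict/correct cycle of a scalar Kalman filter on heading."""
    # Predict: integrate the odometry heading increment.
    theta += dtheta_odom
    var += var_odom
    # Correct: blend in the IMU heading, weighted by uncertainty.
    k = var / (var + var_imu)              # Kalman gain
    theta += k * (theta_imu - theta)
    var *= (1.0 - k)
    return theta, var

# Toy scenario: odometry over-reports each 0.10 rad turn as 0.12 rad,
# while the IMU (assumed noise-free here, for simplicity) reports the
# true heading.
theta_est, var = 0.0, 0.01
theta_true, theta_odom = 0.0, 0.0
for _ in range(50):
    theta_true += 0.10
    theta_odom += 0.12
    theta_est, var = kf_heading_step(theta_est, var, 0.12, 0.0004,
                                     theta_true, 0.01)
print(f"odometry-only error: {abs(theta_odom - theta_true):.2f} rad")
print(f"fused estimate error: {abs(theta_est - theta_true):.2f} rad")
```

The fused estimate stays close to the truth while the raw odometry heading drifts without bound, which is exactly the benefit of combining IMU data with wheel odometry.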
3. Gmapping Algorithm Adjustments
● Particle Filter Error: Gmapping uses a particle filter, and the number and distribution of particles affect accuracy. Too few particles, or a poorly spread particle set, can introduce errors.
● Parameter Settings in the Gmapping Algorithm: Parameters such as the number of particles, step size, and noise model settings impact localization and mapping accuracy. Adjusting these parameters can improve the SLAM algorithm's accuracy.
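The resampleThreshold parameter in the launch file above is tied to this trade-off: as I understand it, gmapping resamples when the effective number of particles falls below the threshold times the particle count. A minimal sketch of that effective-sample-size computation (illustrative only, not gmapping's internal code):

```python
def effective_sample_size(weights):
    """Neff = 1 / sum(w_i^2) for normalized weights. When many
    particles carry negligible weight, Neff drops far below N and
    resampling is triggered."""
    total = sum(weights)
    norm = [w / total for w in weights]
    return 1.0 / sum(w * w for w in norm)

uniform = [1.0] * 80             # all 80 particles equally plausible
skewed = [1.0] + [0.01] * 79     # one particle dominates the rest
print(f"uniform: Neff = {effective_sample_size(uniform):.1f}")
print(f"skewed:  Neff = {effective_sample_size(skewed):.1f}")
```

With a uniform weight distribution Neff equals the particle count and no resampling is needed; in the skewed case Neff collapses to a few particles, well below 0.5 × 80, so resampling would fire.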
By tuning these parameters, you can ensure that the myAGV operates with minimal errors. It is essential to adjust parameters based on the surrounding environment and conduct extensive testing to ensure precise mapping. Only with precise mapping can accurate navigation be ensured.
Summary
Overall, I find that the myAGV performs quite well. Whether in terms of performance, user experience, or the provided documentation, it is very user-friendly and easy for beginners to get started with. Currently, I am continuously using and familiarizing myself with this product. In the future, I plan to work on some interesting projects, hoping to fully leverage the potential of the Jetson Nano B01 by integrating AI and large models.
If you have any good suggestions, feel free to share them!