Automating Small-Scale Farms

New tech helps farm robots navigate identical crop rows without GPS or extra gear — just LiDAR and a smart navigation strategy.

Nick Bild
This robot is harvesting strawberries (📷: T. Fujinaga)

If you have ever taken a stroll around a farming operation, you know that once you have seen a small section of the fields, you have seen them all. Row after row of crops looks virtually identical. This is by design, of course, because the farmers are interested in growing a specific crop, and they do not want anything else working its way into the mix. Furthermore, the regularity of the rows of crops makes it easier for the heavy equipment that is used during harvesting to make its way through the fields.

When it comes to adding computer vision- or LiDAR-based robots into the mix to help care for or harvest the crops, however, that visual monotony is a problem. Since everything looks the same, these robots have a difficult time getting their bearings, which prevents them from navigating through the fields. That may be less of a problem in the future, because a researcher at Osaka Metropolitan University in Japan has developed a novel autonomous navigation strategy that helps these robots find their way, without requiring any additional hardware.

The work focuses on agricultural robots operating in high-bed cultivation environments, like those found in strawberry greenhouses. These environments are especially challenging because of their narrow, cluttered spaces and visually repetitive structures. Traditional path-planning methods often rely on precise localization or pre-mapped paths, but those approaches fall short in dynamic, small-scale farm setups. Instead, the new method uses a hybrid approach that combines waypoint navigation — which directs the robot toward a predefined destination — and cultivation bed navigation, where the robot follows the layout of the planting beds using just LiDAR data.
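The article does not publish the algorithm itself, but the bed-following half of that hybrid can be sketched from first principles: fit a line to the 2D LiDAR returns on the bed side, derive a lateral-offset and heading error, and steer to hold a fixed gap. Everything here (the function names, the target distance, the controller gains, and the sign conventions) is an illustrative assumption, not the researcher's implementation; when no bed is detected, the sketch signals the caller to fall back to waypoint navigation.

```python
import math

TARGET_DIST = 0.5  # desired lateral gap to the bed, in meters (assumed value)

def fit_bed_line(points):
    """Least-squares line fit (y = m*x + b) through LiDAR points in the
    robot frame, where x is forward and y is left (bed assumed on the left).
    Returns (m, b), or None if a line cannot be fit."""
    n = len(points)
    if n < 2:
        return None
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    denom = n * sxx - sx * sx
    if abs(denom) < 1e-9:
        return None  # degenerate cluster of points, no usable line
    m = (n * sxy - sx * sy) / denom
    b = (sy - m * sx) / n
    return m, b

def bed_following_cmd(points, k_dist=1.0, k_head=1.0):
    """Steering command (rad/s, positive = turn left) that holds TARGET_DIST
    from the bed and aligns the heading with it. Returns None when no bed
    line is visible, so the caller can switch to waypoint navigation."""
    line = fit_bed_line(points)
    if line is None:
        return None
    m, b = line
    heading_err = math.atan(m)   # m > 0: bed drifts left ahead, so steer left
    dist_err = b - TARGET_DIST   # positive: robot is too far from the bed
    return k_dist * dist_err + k_head * heading_err
```

Driving straight and parallel to the bed at the target gap yields a zero steering command, and an empty scan returns `None`, which is the cue to hand control back to the waypoint planner.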

The navigation system was first tested in a simulated environment to make sure it was ready for use before being deployed to an actual strawberry farm. It was then deployed on a robot that uses a 2D LiDAR sensor and a tracking camera to detect and follow the cultivation beds. Using this approach, the robot maintained a precise distance and orientation (within ±0.05 meters and ±5 degrees) even as conditions changed. This accuracy allowed the robot to move autonomously between the beds without damaging crops or needing human intervention for navigation.
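Those reported bounds amount to a simple pass/fail check on the tracking error at each control step. A minimal sketch, assuming the errors are measured as a signed lateral offset in meters and a signed heading error in degrees (the function name and signature are hypothetical):

```python
DIST_TOL_M = 0.05    # ±0.05 m lateral tolerance reported in the article
ANGLE_TOL_DEG = 5.0  # ±5 degree orientation tolerance reported in the article

def within_tolerance(dist_err_m, angle_err_deg):
    """True if the robot's tracking error stays inside the reported bounds."""
    return abs(dist_err_m) <= DIST_TOL_M and abs(angle_err_deg) <= ANGLE_TOL_DEG
```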

The robot is a versatile unit, designed with a modular base and interchangeable application modules for harvesting or pruning. Compact and crawler-driven for maneuvering on uneven terrain, the robot was built with small- and mid-scale farming in mind. These farms often struggle to adopt automation due to the high cost and complexity of existing systems. By eliminating the need for GPS or additional localization markers, this navigation approach opens the door for broader adoption of robotics in farming.

In the future, there are plans to further refine the system by developing dynamic simulation environments that mimic real-world challenges such as uneven terrain and shifting farm layouts. These virtual tests will help accelerate improvements in robot design and performance, moving agriculture one step closer to being fully automated.
