The core of Professor Takuya Fujinaga’s research is the deep integration of LiDAR sensing with the agricultural environment, enabling robots to perform high-precision autonomous navigation and crop harvesting in elevated-bed cultivation systems. The system consists of three core modules: environmental perception and map construction, path planning and tracking control, and positioning with dynamic obstacle avoidance. It has been tested in both virtual simulation and real greenhouse environments.
First, the robot scans its surroundings in real time with an onboard LiDAR sensor, generating dense point cloud data. After filtering, clustering, and feature extraction, the point cloud is converted into a two-dimensional or three-dimensional map that clearly marks key elements such as cultivation-bed edges, working aisles, and obstacles. This provides high-precision environmental input for the navigation algorithm, giving the robot a spatial perception capability analogous to “vision”.
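As a rough illustration of this step, the sketch below projects a LiDAR point cloud into a 2D occupancy grid. The height band, grid resolution, and array layout are illustrative assumptions, not details from the paper:

```python
import numpy as np

def cloud_to_occupancy_grid(points, resolution=0.05, z_min=0.1, z_max=1.5):
    """Project a LiDAR point cloud (N x 3, meters) onto a 2D grid.

    Points inside the assumed height band [z_min, z_max] are treated
    as obstacles (bed edges, plants, equipment); floor returns and
    noise above the canopy are discarded.
    """
    # Keep only points at obstacle height.
    mask = (points[:, 2] >= z_min) & (points[:, 2] <= z_max)
    obstacles = points[mask]

    # Shift so the grid origin is the minimum x/y of the retained cloud.
    origin = obstacles[:, :2].min(axis=0)
    cells = ((obstacles[:, :2] - origin) / resolution).astype(int)

    grid = np.zeros(cells.max(axis=0) + 1, dtype=np.uint8)
    grid[cells[:, 0], cells[:, 1]] = 1  # 1 = occupied, 0 = free
    return grid, origin
```

A real pipeline would also cluster the occupied cells to distinguish bed edges from transient obstacles, but the grid above is already a usable input for the planners described next.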
For path planning, the system combines global path planning with local real-time obstacle avoidance. The global planner typically uses the A* or Dijkstra algorithm to compute an optimal path from the robot’s current position to the target point; during execution, local path adjustment relies on the dynamic window approach (DWA) or model predictive control (MPC), which adapts the travel trajectory to the real-time environment so the robot moves stably and smoothly between the raised beds.
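A minimal grid-based A* sketch in the spirit of such a global planner follows; the 4-connected grid and Manhattan heuristic are assumptions for illustration, and the paper’s actual planner may differ:

```python
import heapq

def a_star(grid, start, goal):
    """A* on a 2D occupancy grid (0 = free, 1 = occupied).

    start and goal are (row, col) tuples; returns a list of cells
    from start to goal, or None if no path exists.
    """
    def h(a, b):  # Manhattan distance: admissible on a 4-connected grid
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    rows, cols = len(grid), len(grid[0])
    open_set = [(h(start, goal), start)]  # priority queue keyed on f = g + h
    came_from = {start: None}
    g_cost = {start: 0}

    while open_set:
        _, node = heapq.heappop(open_set)
        if node == goal:
            path = []  # reconstruct by walking parent links back to start
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        r, c = node
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g_cost[node] + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    came_from[nxt] = node
                    heapq.heappush(open_set, (ng + h(nxt, goal), nxt))
    return None
```

The local layer (DWA or MPC) would then track this path while reacting to obstacles that appear between replanning cycles.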
The navigation system performed particularly well in tests on both straight cultivation beds and curved or cornering scenes, keeping the deviation from the bed edge within 10 cm and effectively reducing the risk of collision or drift. This accuracy is especially important for high-value crops such as strawberries and tomatoes that must be harvested precisely.
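One plausible way to monitor that lateral deviation, sketched here under assumed conventions rather than as the paper’s published method, is to fit a line to the LiDAR points detected on the bed edge and measure the robot’s perpendicular distance to it:

```python
import numpy as np

def lateral_deviation(edge_points):
    """Perpendicular distance (m) from the robot to the bed edge.

    edge_points: N x 2 array of (x, y) bed-edge points in the robot
    frame (robot at the origin). The line is fitted by total least
    squares via SVD, then the point-to-line distance is taken.
    """
    centroid = edge_points.mean(axis=0)
    centered = edge_points - centroid
    # Principal direction of the edge points = fitted line direction.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    direction = vt[0]
    normal = np.array([-direction[1], direction[0]])
    # Distance from the origin (the robot) to the fitted line.
    return abs(np.dot(centroid, normal))

# Example: an edge running parallel to the robot, ~0.35 m to its right.
pts = np.column_stack([np.linspace(-1, 1, 20),
                       np.full(20, -0.35) + np.random.normal(0, 0.01, 20)])
print(f"deviation: {lateral_deviation(pts):.3f} m")  # ~0.35
```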
At the same time, the system uses positioning based on laser SLAM (Simultaneous Localization and Mapping), building a map in real time while accurately estimating the robot’s position and orientation in enclosed, GPS-denied environments. This adaptive positioning approach makes the system particularly well suited to semi-structured agricultural settings such as greenhouses.
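At its core, this kind of pose tracking composes incremental motion estimates (for example, the relative transforms returned by matching successive scans) into a global pose. A minimal SE(2) sketch of that update, with the scan-matching step itself left out, might look like this:

```python
import numpy as np

def compose(pose, delta):
    """Compose a global SE(2) pose with a relative motion estimate.

    pose  = (x, y, theta) in the map frame.
    delta = (dx, dy, dtheta) in the robot's current frame, e.g. the
    relative transform estimated between two consecutive LiDAR scans.
    """
    x, y, th = pose
    dx, dy, dth = delta
    return (x + dx * np.cos(th) - dy * np.sin(th),
            y + dx * np.sin(th) + dy * np.cos(th),
            (th + dth + np.pi) % (2 * np.pi) - np.pi)  # wrap to [-pi, pi)

# Accumulate the pose as the robot drives down an aisle, then turns.
pose = (0.0, 0.0, 0.0)
for step in [(0.5, 0.0, 0.0)] * 4 + [(0.0, 0.0, np.pi / 2)]:
    pose = compose(pose, step)
print(pose)  # ~(2.0, 0.0, 1.571): 2 m forward, then a 90-degree turn
```

A full SLAM system additionally corrects the accumulated drift of this dead-reckoned chain against the map it is building, which is what keeps the estimate accurate over long runs.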
Professor Fujinaga pointed out: “We did not just develop a set of algorithms; we tried to establish a framework that enables agricultural robots to ‘understand’ the farming environment and respond intelligently within it. If robots can move more precisely in the fields, they will be able to perform tasks far beyond harvesting, including disease monitoring, pruning, and spraying.”
This study verified the stability and scalability of the system and was published in the journal “Computers and Electronics in Agriculture”, providing important technical support for the construction of smart farms and marking a key step toward the practical application of agricultural robots.
Reference: https://www.asiaresearchnews.com/content/farm-robot-autonomously-navigates-harvests-among-raised-beds