Accurate row detection is crucial for autonomous navigation systems in agricultural applications. Since crops are typically planted in evenly spaced, parallel rows, crop rows correspond to a distinct spectral signature in the frequency domain. In this study, we propose a robust frequency domain-based image processing pipeline for accurately locating crop rows within images. Using the two-dimensional discrete Fourier transform (DFT), we identify the spacing and direction of crop rows from the significant frequency components, and their positions from the phase. To enhance precision, we apply frequency domain interpolation, a Hanning window, and the discrete-time Fourier transform (DTFT) to accurately locate peak responses and their corresponding phases in the spectrum. Extensive experiments were conducted to validate the accuracy and efficiency of our algorithm. The algorithm accurately detected crop rows in images and estimated lateral position and heading deviations relative to row centerlines across different plant growth stages and illumination conditions. Additionally, by integrating a linear quadratic Gaussian (LQG) navigation controller with this vision-based crop row detection algorithm, we enabled a robot to follow crop row centerlines with a mean absolute tracking error of 3.74 cm. These results highlight the potential of our frequency domain-based approach for enhancing precision agriculture through robust crop row detection and navigation.
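To illustrate the core frequency-domain idea described above, the sketch below estimates row spacing and orientation from the dominant peak of a 2-D DFT and a lateral offset from its phase. This is a minimal, assumed implementation for illustration only, not the authors' full pipeline (which additionally uses frequency domain interpolation, a Hanning window, and DTFT-based refinement); the function name and parameters are hypothetical.

```python
import numpy as np

def detect_row_signal(gray_image, dc_guard=2):
    """Sketch: estimate crop-row spacing, orientation, and lateral offset
    from the dominant periodic component of a grayscale image (assumed
    simplification of the frequency-domain approach described above)."""
    # Remove the mean so the DC term does not dominate the spectrum.
    img = gray_image.astype(float) - gray_image.mean()
    spectrum = np.fft.fft2(img)
    magnitude = np.abs(np.fft.fftshift(spectrum))

    h, w = img.shape
    cy, cx = h // 2, w // 2

    # Suppress a small neighbourhood around DC so slow illumination
    # gradients are not mistaken for the row pattern.
    magnitude[cy - dc_guard:cy + dc_guard + 1,
              cx - dc_guard:cx + dc_guard + 1] = 0

    # The strongest remaining peak corresponds to the periodic row pattern.
    py, px = np.unravel_index(np.argmax(magnitude), magnitude.shape)
    fy, fx = (py - cy) / h, (px - cx) / w            # cycles per pixel
    spacing = 1.0 / np.hypot(fy, fx)                 # row period in pixels
    orientation = np.degrees(np.arctan2(fy, fx))     # angle of the row normal

    # The phase at the peak encodes the lateral shift of the row pattern.
    phase = np.angle(spectrum[py - cy, px - cx])     # unshifted spectrum index
    offset = -phase / (2 * np.pi) * spacing          # pattern shift in pixels

    return spacing, orientation, offset
```

Peak-picking on raw DFT bins limits resolution to one bin; the precision-enhancement steps named in the abstract (interpolation, windowing, DTFT evaluation near the peak) exist precisely to refine these coarse estimates.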