Development of a vision-based local positioning system for weed detection
Herbicide applications could be reduced if they were targeted, but targeting requires prior identification and quantification of the weed population, a task that could be performed by a weed-scouting robot. The ability to position a camera over the inter-row space of densely seeded crops would simplify the task of automatically quantifying weed infestations. As part of the development of an autonomous weed scout, a vision-based local positioning system for weed detection was developed and tested in a laboratory setting. Four line-detection algorithms were tested, and a robotic positioning device, or XYZtheta-table, was developed and tested. The line-detection algorithms were based on stripe analysis, blob analysis, linear regression, and the Hough transform, respectively; the last two also included an edge-detection step. Images of parallel line patterns representing crop rows were collected at different angles, with and without weed-simulating noise, and processed by the four programs. The ability of each program to determine the angle of the rows and the location of an inter-row centreline was evaluated in a laboratory setting. All algorithms performed similarly when determining row angle in the noise-free images, with a mean error of 0.5°; under the same conditions, all algorithms located the centreline of an inter-row space to within 2.7 mm. Mean errors generally increased when noise was added to the images, up to 1.1° and 8.5 mm for the linear-regression algorithm. Specific spatial distributions of the weeds were identified as likely causes of the increased error in noisy images. Because of its insensitivity to noise, the stripe-analysis algorithm was considered the best overall; the fastest program was the blob-analysis algorithm, with a mean processing time of 0.35 s per image. Future work involves evaluating the line-detection algorithms with field images.
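To illustrate the general idea behind Hough-based row detection (this is a minimal sketch for intuition, not the implementation evaluated in the thesis), the dominant row angle in a binary crop-row image can be estimated by projecting foreground pixels onto the normal of each candidate angle and voting for the angle that concentrates the most pixels into a single narrow bin:

```python
import numpy as np

def hough_row_angle(binary, angles_deg=None):
    """Estimate the dominant row angle (in degrees from vertical) of a
    binary crop-row image with a simple Hough-style voting scheme.

    For each candidate angle t, every foreground pixel (x, y) is
    projected to rho = x*cos(t) + y*sin(t).  When t matches the row
    orientation, all pixels of a row collapse into the same 1-pixel-wide
    rho bin, producing a strong vote; otherwise the projections spread
    across many bins.
    """
    if angles_deg is None:
        angles_deg = np.arange(-45.0, 45.5, 0.5)  # search range (assumed)
    ys, xs = np.nonzero(binary)                   # foreground pixel coords
    best_angle, best_score = 0.0, -1
    for a in angles_deg:
        t = np.deg2rad(a)
        rho = xs * np.cos(t) + ys * np.sin(t)
        # Histogram the projections into 1-pixel-wide bins and score the
        # angle by its tallest bin (the strongest single "line" of votes).
        counts, _ = np.histogram(rho, bins=np.arange(rho.min(), rho.max() + 2.0))
        score = counts.max()
        if score > best_score:
            best_score, best_angle = score, a
    return best_angle
```

For example, an image containing vertical stripes at columns 10, 20, and 30 yields an estimated angle of 0°. A real implementation would first apply edge detection (as the thesis's Hough and linear-regression algorithms do) and could locate an inter-row centreline from the midpoint between adjacent rho peaks.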
The XYZtheta-table consisted of rails allowing movement of a camera in the three orthogonal directions and a rotational table that could rotate the camera about a vertical axis. The ability of the XYZtheta-table to accurately position the camera within the XY-space and rotate it to a desired angle was evaluated in a laboratory setting. The XYZtheta-table was able to move the camera to within 7 mm of a target and to rotate it with a mean error of 0.07°. The positioning accuracy could be improved by simple mechanical modifications to the XYZtheta-table.
Degree: Master of Science (M.Sc.)
Department: Agricultural and Bioresource Engineering
Program: Agricultural and Bioresource Engineering
Supervisor: Crowe, Trever G.
Committee: Guo, Huiqing; Bolton, Ronald J.; Roberge, Martin
Copyright Date: April 2004
Keywords: crop row detection