Research on maize canopy center recognition based on nonsignificant color difference segmentation
| Main authors: | |
|---|---|
| Format: | Online article, text |
| Language: | English |
| Published: | Public Library of Science, 2018 |
| Subjects: | |
| Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6160289/ https://www.ncbi.nlm.nih.gov/pubmed/30261517 http://dx.doi.org/10.1371/journal.pone.0202366 |
| Summary: | Weed control is a substantial challenge in field management. A better weed control method at an earlier growth stage is important for increasing yields. As a promising weed control technique, intelligent weeding based on machine vision avoids the harm caused by chemical weeding. For machine vision, it is critical to extract and segment crops from their background. However, there is still no optimal solution for tracking occluded objects against a similarly colored background. In this study, it was found that the gray distribution of a maize canopy follows a gradient law. Therefore, a recognition method based on HLS-SVM (the HLS color space and a Support Vector Machine) and the grayscale gradient was developed. First, the HLS color space was used to detect the maize canopy. Second, the SVM method was used to segment the central region of the maize canopy. Finally, the maize canopy center was identified according to the gradient law. The results showed that the average segmentation time was 0.49 s, the average segmentation quality was 87.25%, and the standard deviation of the segmentation was 3.57%. The average recognition rate of the center position was 93.33%. This study provides a machine vision method for intelligent agricultural weeding equipment as well as a theoretical reference for further research on agricultural machine vision. |
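
For readers who want a concrete starting point, the sketch below illustrates the kind of three-stage pipeline the summary describes: HLS color-space thresholding to detect the canopy, an SVM to separate the central canopy region, and a grayscale-based criterion to pick the center point. It is a minimal illustration in Python using OpenCV and scikit-learn; the thresholds, features, and the brightest-point criterion are assumptions made for demonstration, not the authors' published implementation.

```python
# Minimal sketch of an HLS + SVM + grayscale pipeline.
# Thresholds, features, and the center criterion are illustrative
# assumptions, not the method reported in the paper.
import cv2
import numpy as np
from sklearn.svm import SVC


def canopy_mask_hls(bgr_image):
    """Rough vegetation mask from HLS thresholds (illustrative values)."""
    hls = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HLS)
    h, l, s = cv2.split(hls)
    # OpenCV hue spans 0-179; roughly 35-90 covers green foliage.
    mask = (h >= 35) & (h <= 90) & (s >= 40) & (l >= 40) & (l <= 220)
    return mask.astype(np.uint8) * 255


def train_center_classifier(pixel_features, pixel_labels):
    """SVM separating central-canopy pixels (label 1) from the rest (label 0).

    `pixel_features` is an (n, d) array, e.g. per-pixel H, L, S values plus
    local gray-level statistics; labels come from hand-annotated images.
    """
    clf = SVC(kernel="rbf", gamma="scale")
    clf.fit(pixel_features, pixel_labels)
    return clf


def locate_center(gray, center_mask):
    """Pick the center as a gray-level extremum inside the segmented region.

    The brightest point of a smoothed gray image is used here only as a
    placeholder for the paper's gradient-law criterion.
    """
    blurred = cv2.GaussianBlur(gray.astype(np.float32), (15, 15), 0)
    blurred[center_mask == 0] = -np.inf  # search only inside the center region
    cy, cx = np.unravel_index(np.argmax(blurred), blurred.shape)
    return int(cx), int(cy)
```

In a pipeline of this shape, the SVM would typically be trained offline on hand-labelled canopy and center pixels and then applied per image to refine the HLS mask before the center search; actual runtime relative to the reported 0.49 s average would depend on image size and feature choice.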