
Assessment of Mixed Sward Using Context Sensitive Convolutional Neural Networks


Bibliographic Details
Main Authors: Bateman, Christopher J., Fourie, Jaco, Hsiao, Jeffrey, Irie, Kenji, Heslop, Angus, Hilditch, Anthony, Hagedorn, Michael, Jessep, Bruce, Gebbie, Steve, Ghamkhar, Kioumars
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7056886/
https://www.ncbi.nlm.nih.gov/pubmed/32174941
http://dx.doi.org/10.3389/fpls.2020.00159
Description
Summary: Breeding higher yielding forage species is limited by the manual harvesting and visual scoring techniques currently used for measuring or estimating biomass. Automation and remote sensing for high-throughput phenotyping have been used in recent years as a viable solution to this bottleneck. Here, we focus on using RGB imaging and deep learning for white clover (Trifolium repens L.) and perennial ryegrass (Lolium perenne L.) yield estimation in a mixed sward. We present a new convolutional neural network (CNN) architecture designed for semantic segmentation of dense pasture and canopies with high occlusion, which we have named the local context network (LC-Net). On our testing data set we obtain a mean accuracy of 95.4% and a mean intersection over union of 81.3%, outperforming other methods we have found in the literature for segmenting clover from ryegrass. However, comparing the clover/vegetation fraction for visual coverage and harvested dry matter showed little improvement from the segmentation accuracy gains. Further gains in biomass estimation accuracy may be achievable by combining RGB with complementary information such as volumetric data from other sensors, which will form the basis of our future work.
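The abstract reports segmentation performance as mean accuracy and mean intersection over union (IoU). For readers unfamiliar with these metrics, the sketch below shows one common way they are computed for dense label maps; the two-class toy data, class skipping rules, and function names are illustrative assumptions, not the authors' evaluation code.

```python
import numpy as np

def mean_iou(pred, target, num_classes):
    """Mean IoU over classes for two integer label maps of the same shape."""
    ious = []
    for c in range(num_classes):
        pred_c, target_c = pred == c, target == c
        union = np.logical_or(pred_c, target_c).sum()
        if union == 0:
            continue  # class absent in both maps; skip so it does not skew the mean
        intersection = np.logical_and(pred_c, target_c).sum()
        ious.append(intersection / union)
    return float(np.mean(ious))

def mean_accuracy(pred, target, num_classes):
    """Mean of per-class pixel recall across classes present in the target."""
    accs = []
    for c in range(num_classes):
        mask = target == c
        if mask.sum() == 0:
            continue
        accs.append((pred[mask] == c).mean())
    return float(np.mean(accs))

# Toy example with two classes (0 = ryegrass, 1 = clover)
pred = np.array([[0, 1], [1, 1]])
target = np.array([[0, 1], [0, 1]])
print(mean_iou(pred, target, num_classes=2))       # 0.583...
print(mean_accuracy(pred, target, num_classes=2))  # 0.75
```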