
Autonomous Crop Row Guidance Using Adaptive Multi-ROI in Strawberry Fields

Automated robotic platforms are an important part of precision agriculture solutions for sustainable food production. Agri-robots require robust and accurate guidance systems in order to navigate between crops and to and from their base station. Onboard sensors such as machine vision cameras offer a flexible guidance alternative to more expensive solutions for structured environments such as scanning lidar or RTK-GNSS. The main challenges for visual crop row guidance are the dramatic differences in appearance of crops between farms and throughout the season and the variations in crop spacing and contours of the crop rows. Here we present a visual guidance pipeline for an agri-robot operating in strawberry fields in Norway that is based on semantic segmentation with a convolutional neural network (CNN) to segment input RGB images into crop and not-crop (i.e., drivable terrain) regions. To handle the uneven contours of crop rows in Norway’s hilly agricultural regions, we develop a new adaptive multi-ROI method for fitting trajectories to the drivable regions. We test our approach in open-loop trials with a real agri-robot operating in the field and show that our approach compares favourably to other traditional guidance approaches.
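The abstract only summarises the pipeline; the linked paper should be consulted for the actual method. As a loose illustration of the multi-ROI idea it describes, the sketch below assumes a binary drivable-terrain mask already produced by a segmentation CNN, walks up the image one horizontal ROI at a time, re-centres each ROI on the centroid found in the band below, and fits a polynomial centreline through the centroids. All names and parameters (fit_row_trajectory, n_rois, roi_half_width, poly_degree) are hypothetical and not taken from the paper.

```python
import numpy as np

def fit_row_trajectory(drivable_mask: np.ndarray,
                       n_rois: int = 8,
                       roi_half_width: int = 80,
                       poly_degree: int = 2):
    """Fit a centreline through the drivable region of a binary mask.

    drivable_mask: HxW array with 1 = drivable terrain, 0 = crop.
    Returns polynomial coefficients of x(y) and the per-ROI centre points.
    """
    h, w = drivable_mask.shape
    band_h = h // n_rois
    centres = []
    x_prev = w // 2  # start the search at the image centre (bottom band)

    # Walk from the bottom of the image upwards, one horizontal band per ROI.
    for i in range(n_rois):
        y_hi = h - i * band_h
        y_lo = y_hi - band_h
        # Adaptive step: centre this ROI on the centroid found in the band
        # below, so the window follows curved row contours instead of a
        # fixed vertical strip.
        x_lo = max(0, x_prev - roi_half_width)
        x_hi = min(w, x_prev + roi_half_width)
        roi = drivable_mask[y_lo:y_hi, x_lo:x_hi]
        ys, xs = np.nonzero(roi)
        if xs.size == 0:
            continue  # no drivable pixels in this band; skip it
        x_prev = int(xs.mean()) + x_lo
        centres.append((x_prev, (y_lo + y_hi) // 2))

    centres = np.array(centres)
    if len(centres) <= poly_degree:
        raise ValueError("too few drivable bands to fit a trajectory")
    # Fit x as a polynomial in y; a controller can steer toward this line.
    coeffs = np.polyfit(centres[:, 1], centres[:, 0], poly_degree)
    return coeffs, centres
```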


Bibliographic Details
Main Authors: Ponnambalam, Vignesh Raja; Bakken, Marianne; Moore, Richard J. D.; Gjevestad, Jon Glenn Omholt; From, Pål Johan
Format: Online Article Text
Language: English
Published: MDPI 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7571079/
https://www.ncbi.nlm.nih.gov/pubmed/32937939
http://dx.doi.org/10.3390/s20185249
author Ponnambalam, Vignesh Raja
Bakken, Marianne
Moore, Richard J. D.
Gjevestad, Jon Glenn Omholt
From, Pål Johan
collection PubMed
description Automated robotic platforms are an important part of precision agriculture solutions for sustainable food production. Agri-robots require robust and accurate guidance systems in order to navigate between crops and to and from their base station. Onboard sensors such as machine vision cameras offer a flexible guidance alternative to more expensive solutions for structured environments such as scanning lidar or RTK-GNSS. The main challenges for visual crop row guidance are the dramatic differences in appearance of crops between farms and throughout the season and the variations in crop spacing and contours of the crop rows. Here we present a visual guidance pipeline for an agri-robot operating in strawberry fields in Norway that is based on semantic segmentation with a convolutional neural network (CNN) to segment input RGB images into crop and not-crop (i.e., drivable terrain) regions. To handle the uneven contours of crop rows in Norway’s hilly agricultural regions, we develop a new adaptive multi-ROI method for fitting trajectories to the drivable regions. We test our approach in open-loop trials with a real agri-robot operating in the field and show that our approach compares favourably to other traditional guidance approaches.
format Online
Article
Text
id pubmed-7571079
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-7571079 2020-10-28 Autonomous Crop Row Guidance Using Adaptive Multi-ROI in Strawberry Fields. Ponnambalam, Vignesh Raja; Bakken, Marianne; Moore, Richard J. D.; Gjevestad, Jon Glenn Omholt; From, Pål Johan. Sensors (Basel), Article. MDPI 2020-09-14 /pmc/articles/PMC7571079/ /pubmed/32937939 http://dx.doi.org/10.3390/s20185249 Text en © 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
title Autonomous Crop Row Guidance Using Adaptive Multi-ROI in Strawberry Fields
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7571079/
https://www.ncbi.nlm.nih.gov/pubmed/32937939
http://dx.doi.org/10.3390/s20185249