Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields
To reduce production costs and the environmental pollution caused by the overapplication of herbicide in paddy fields, the locations of rice seedlings and weeds must be detected for site-specific weed management (SSWM). With the development of deep learning, a semantic segm...
Main Authors: | Ma, Xu; Deng, Xiangwu; Qi, Long; Jiang, Yu; Li, Hongwei; Wang, Yuwei; Xing, Xupo |
Format: | Online Article Text |
Language: | English |
Published: | Public Library of Science, 2019 |
Subjects: | Research Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6472823/ https://www.ncbi.nlm.nih.gov/pubmed/30998770 http://dx.doi.org/10.1371/journal.pone.0215676 |
_version_ | 1783412319175311360 |
author | Ma, Xu Deng, Xiangwu Qi, Long Jiang, Yu Li, Hongwei Wang, Yuwei Xing, Xupo |
author_facet | Ma, Xu Deng, Xiangwu Qi, Long Jiang, Yu Li, Hongwei Wang, Yuwei Xing, Xupo |
author_sort | Ma, Xu |
collection | PubMed |
description | To reduce production costs and the environmental pollution caused by the overapplication of herbicide in paddy fields, the locations of rice seedlings and weeds must be detected for site-specific weed management (SSWM). With the development of deep learning, a semantic segmentation method using SegNet, which is based on a fully convolutional network (FCN), was proposed. In this paper, RGB color images of seedling-stage rice were captured in paddy fields, and ground truth (GT) images were obtained by manually labeling the pixels in the RGB images with three separate categories, namely, rice seedlings, background, and weeds. Class weight coefficients were calculated to address the imbalance in the number of pixels per category. The GT and RGB images were used for training and testing: 80% of the samples were randomly selected as the training dataset and 20% were used as the test dataset. The proposed method was compared with classical semantic segmentation models, namely, FCN and U-Net. The average accuracy rate of the SegNet method was 92.7%, whereas the average accuracy rates of the FCN and U-Net methods were 89.5% and 70.8%, respectively. The proposed SegNet method achieved higher classification accuracy and could effectively classify the pixels of rice seedlings, background, and weeds in paddy field images and acquire the positions of their regions. (An illustrative sketch of the class-weighting and train/test-split steps follows the record fields below.) |
format | Online Article Text |
id | pubmed-6472823 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-6472823 2019-05-03 Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields Ma, Xu Deng, Xiangwu Qi, Long Jiang, Yu Li, Hongwei Wang, Yuwei Xing, Xupo PLoS One Research Article To reduce production costs and the environmental pollution caused by the overapplication of herbicide in paddy fields, the locations of rice seedlings and weeds must be detected for site-specific weed management (SSWM). With the development of deep learning, a semantic segmentation method using SegNet, which is based on a fully convolutional network (FCN), was proposed. In this paper, RGB color images of seedling-stage rice were captured in paddy fields, and ground truth (GT) images were obtained by manually labeling the pixels in the RGB images with three separate categories, namely, rice seedlings, background, and weeds. Class weight coefficients were calculated to address the imbalance in the number of pixels per category. The GT and RGB images were used for training and testing: 80% of the samples were randomly selected as the training dataset and 20% were used as the test dataset. The proposed method was compared with classical semantic segmentation models, namely, FCN and U-Net. The average accuracy rate of the SegNet method was 92.7%, whereas the average accuracy rates of the FCN and U-Net methods were 89.5% and 70.8%, respectively. The proposed SegNet method achieved higher classification accuracy and could effectively classify the pixels of rice seedlings, background, and weeds in paddy field images and acquire the positions of their regions. Public Library of Science 2019-04-18 /pmc/articles/PMC6472823/ /pubmed/30998770 http://dx.doi.org/10.1371/journal.pone.0215676 Text en © 2019 Ma et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
spellingShingle | Research Article Ma, Xu Deng, Xiangwu Qi, Long Jiang, Yu Li, Hongwei Wang, Yuwei Xing, Xupo Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields |
title | Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields |
title_full | Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields |
title_fullStr | Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields |
title_full_unstemmed | Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields |
title_short | Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields |
title_sort | fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6472823/ https://www.ncbi.nlm.nih.gov/pubmed/30998770 http://dx.doi.org/10.1371/journal.pone.0215676 |
work_keys_str_mv | AT maxu fullyconvolutionalnetworkforriceseedlingandweedimagesegmentationattheseedlingstageinpaddyfields AT dengxiangwu fullyconvolutionalnetworkforriceseedlingandweedimagesegmentationattheseedlingstageinpaddyfields AT qilong fullyconvolutionalnetworkforriceseedlingandweedimagesegmentationattheseedlingstageinpaddyfields AT jiangyu fullyconvolutionalnetworkforriceseedlingandweedimagesegmentationattheseedlingstageinpaddyfields AT lihongwei fullyconvolutionalnetworkforriceseedlingandweedimagesegmentationattheseedlingstageinpaddyfields AT wangyuwei fullyconvolutionalnetworkforriceseedlingandweedimagesegmentationattheseedlingstageinpaddyfields AT xingxupo fullyconvolutionalnetworkforriceseedlingandweedimagesegmentationattheseedlingstageinpaddyfields |
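The description above mentions two concrete preprocessing steps: computing class weight coefficients to offset the imbalance between rice-seedling, background, and weed pixels, and randomly splitting the samples 80/20 into training and test sets. The sketch below illustrates both steps in Python. It is not the authors' code: the median-frequency balancing formula, the label encoding (0 = background, 1 = rice seedling, 2 = weed), and the `gt_labels/` directory are assumptions made for the example.

```python
# Minimal sketch (assumed, not the paper's implementation): per-class weight
# coefficients from ground-truth (GT) label images plus an 80/20 random split.
import glob
import random

import numpy as np
from PIL import Image

# Hypothetical label encoding: 0 = background, 1 = rice seedling, 2 = weed.
NUM_CLASSES = 3


def class_weights(gt_paths):
    """Median-frequency balancing: weight_c = median_freq / freq_c, where
    freq_c = (pixels of class c) / (total pixels of images containing c)."""
    pixel_counts = np.zeros(NUM_CLASSES, dtype=np.int64)
    image_pixels = np.zeros(NUM_CLASSES, dtype=np.int64)
    for path in gt_paths:
        labels = np.array(Image.open(path))
        for c in range(NUM_CLASSES):
            n = int((labels == c).sum())
            if n > 0:
                pixel_counts[c] += n
                image_pixels[c] += labels.size
    freq = pixel_counts / np.maximum(image_pixels, 1)
    return np.median(freq) / np.maximum(freq, 1e-12)


def train_test_split(paths, train_fraction=0.8, seed=0):
    """Randomly assign 80% of the samples to training and 20% to testing."""
    paths = list(paths)
    random.Random(seed).shuffle(paths)
    cut = int(len(paths) * train_fraction)
    return paths[:cut], paths[cut:]


if __name__ == "__main__":
    gt_paths = sorted(glob.glob("gt_labels/*.png"))  # hypothetical GT directory
    train_paths, test_paths = train_test_split(gt_paths)
    print("class weights:", class_weights(train_paths))
    print(len(train_paths), "training /", len(test_paths), "test images")
```

Weights computed this way would typically be passed to the cross-entropy loss during SegNet training so that the comparatively rare weed and seedling pixels are not overwhelmed by the background class.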