
A Deep Learning Model for Detecting Cage-Free Hens on the Litter Floor


Bibliographic Details
Main Authors: Yang, Xiao; Chai, Lilong; Bist, Ramesh Bahadur; Subedi, Sachin; Wu, Zihao
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9367364/
https://www.ncbi.nlm.nih.gov/pubmed/35953972
http://dx.doi.org/10.3390/ani12151983
_version_ 1784765782074327040
author Yang, Xiao
Chai, Lilong
Bist, Ramesh Bahadur
Subedi, Sachin
Wu, Zihao
author_facet Yang, Xiao
Chai, Lilong
Bist, Ramesh Bahadur
Subedi, Sachin
Wu, Zihao
author_sort Yang, Xiao
collection PubMed
description SIMPLE SUMMARY: Real-time and automatic detection of chickens such as laying hens and broilers is the cornerstone of precision poultry farming. For laying hens, it is more challenging under cage-free conditions compared to caged systems. In this study, we developed a deep learning model (YOLOv5x-hens) to monitor hens’ behaviors in cage-free facilities. More than 1000 images were used to train the model and an additional 200 images were used to test it. The newly developed YOLOv5x-hens showed stable performance in detecting birds under different lighting intensities, angles, and ages over 8 weeks. According to further data analysis, the model performed efficiently in real-time detection with an overall accuracy of more than 95%, which is a key step toward tracking individual birds for the evaluation of production and welfare. However, there are still some limitations in the current version of the model. Detection errors were caused by highly overlapping stock, uneven light intensity, and occlusion by equipment (e.g., drinking lines and feeders). Future research is needed to address those issues for a higher detection rate. The current study provides a technical basis for developing a machine vision system for tracking individual birds to evaluate animals’ behaviors and welfare status in commercial cage-free houses.
ABSTRACT: Real-time and automatic detection of chickens (e.g., laying hens and broilers) is the cornerstone of precision poultry farming based on image recognition. However, such identification becomes more challenging under cage-free conditions compared to caged hens. In this study, we developed a deep learning model (YOLOv5x-hens) based on YOLOv5, an advanced convolutional neural network (CNN), to monitor hens’ behaviors in cage-free facilities. More than 1000 images were used to train the model and an additional 200 images were used to test it. One-way ANOVA and Tukey HSD analyses were conducted using JMP software (JMP Pro 16 for Mac, SAS Institute, Cary, North Carolina) to determine whether there were significant differences between the predicted number of hens and the actual number of hens under various conditions (i.e., age, light intensity, and observational angle). The difference was considered significant at p < 0.05. Our results show that the evaluation metrics (Precision, Recall, F1, and mAP@0.5) of the YOLOv5x-hens model were 0.96, 0.96, 0.96, and 0.95, respectively, in detecting hens on the litter floor. The newly developed YOLOv5x-hens showed stable performance in detecting birds under different lighting intensities, angles, and ages over 8 weeks (i.e., birds were 8–16 weeks old). For instance, the model achieved 95% accuracy once the birds were 8 weeks old. However, younger chicks, such as one-week-old birds, were harder to track (e.g., only 25% accuracy) due to interference from equipment such as feeders, drinking lines, and perches. According to further data analysis, the model performed efficiently in real-time detection with an overall accuracy of more than 95%, which is a key step toward tracking individual birds for the evaluation of production and welfare. However, there are some limitations in the current version of the model. Detection errors came from highly overlapping stock, uneven light intensity, and occlusion by equipment (e.g., drinking lines and feeders). Future research is needed to address those issues for a higher detection rate. The current study established a novel CNN-based deep learning model for detecting hens in research cage-free facilities, which provides a technical basis for developing a machine vision system for tracking individual birds and evaluating their behaviors and welfare status in commercial cage-free houses.
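As a rough illustration of the workflow summarized above (not the authors' code), the minimal sketch below counts hens per frame with a fine-tuned YOLOv5x detector and then compares count accuracy across lighting conditions with a one-way ANOVA and Tukey HSD test. The checkpoint name (yolov5x_hens.pt), image paths, manual counts, and grouping are hypothetical placeholders, and scipy/statsmodels stand in for the JMP Pro 16 software used in the study; only the public ultralytics/yolov5 PyTorch Hub interface is assumed.

    import torch
    from scipy.stats import f_oneway
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    # Load a YOLOv5x model fine-tuned for hen detection via the public
    # PyTorch Hub interface of ultralytics/yolov5. "yolov5x_hens.pt" is a
    # placeholder name, not the authors' released weights.
    model = torch.hub.load("ultralytics/yolov5", "custom", path="yolov5x_hens.pt")
    model.conf = 0.5  # confidence threshold for keeping detections

    # Hypothetical test images with manually counted hens, grouped by light intensity.
    test_images = [
        {"path": "imgs/low_01.jpg",  "actual": 28, "group": "low"},
        {"path": "imgs/low_02.jpg",  "actual": 30, "group": "low"},
        {"path": "imgs/mid_01.jpg",  "actual": 31, "group": "medium"},
        {"path": "imgs/mid_02.jpg",  "actual": 29, "group": "medium"},
        {"path": "imgs/high_01.jpg", "actual": 30, "group": "high"},
        {"path": "imgs/high_02.jpg", "actual": 32, "group": "high"},
    ]

    accuracies, groups = [], []
    for img in test_images:
        results = model(img["path"])        # run detection on one frame
        predicted = len(results.xyxy[0])    # one row per detected hen
        # Count-based accuracy: predicted hen count vs. the manual count.
        accuracies.append(1 - abs(predicted - img["actual"]) / img["actual"])
        groups.append(img["group"])

    # One-way ANOVA across light-intensity groups, then Tukey HSD pairwise
    # comparisons at alpha = 0.05 (open-source stand-in for the JMP analysis).
    by_group = {g: [a for a, gg in zip(accuracies, groups) if gg == g]
                for g in sorted(set(groups))}
    print(f_oneway(*by_group.values()))
    print(pairwise_tukeyhsd(endog=accuracies, groups=groups, alpha=0.05))

With real per-image counts from the test set, the Tukey HSD table indicates which lighting groups differ significantly at p < 0.05, mirroring the comparison reported in the abstract.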
format Online
Article
Text
id pubmed-9367364
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9367364 2022-08-12 A Deep Learning Model for Detecting Cage-Free Hens on the Litter Floor Yang, Xiao Chai, Lilong Bist, Ramesh Bahadur Subedi, Sachin Wu, Zihao Animals (Basel) Article SIMPLE SUMMARY: Real-time and automatic detection of chickens such as laying hens and broilers is the cornerstone of precision poultry farming. For laying hens, it is more challenging under cage-free conditions compared to caged systems. In this study, we developed a deep learning model (YOLOv5x-hens) to monitor hens’ behaviors in cage-free facilities. More than 1000 images were used to train the model and an additional 200 images were used to test it. The newly developed YOLOv5x-hens showed stable performance in detecting birds under different lighting intensities, angles, and ages over 8 weeks. According to further data analysis, the model performed efficiently in real-time detection with an overall accuracy of more than 95%, which is a key step toward tracking individual birds for the evaluation of production and welfare. However, there are still some limitations in the current version of the model. Detection errors were caused by highly overlapping stock, uneven light intensity, and occlusion by equipment (e.g., drinking lines and feeders). Future research is needed to address those issues for a higher detection rate. The current study provides a technical basis for developing a machine vision system for tracking individual birds to evaluate animals’ behaviors and welfare status in commercial cage-free houses. ABSTRACT: Real-time and automatic detection of chickens (e.g., laying hens and broilers) is the cornerstone of precision poultry farming based on image recognition. However, such identification becomes more challenging under cage-free conditions compared to caged hens. In this study, we developed a deep learning model (YOLOv5x-hens) based on YOLOv5, an advanced convolutional neural network (CNN), to monitor hens’ behaviors in cage-free facilities. More than 1000 images were used to train the model and an additional 200 images were used to test it. One-way ANOVA and Tukey HSD analyses were conducted using JMP software (JMP Pro 16 for Mac, SAS Institute, Cary, North Carolina) to determine whether there were significant differences between the predicted number of hens and the actual number of hens under various conditions (i.e., age, light intensity, and observational angle). The difference was considered significant at p < 0.05. Our results show that the evaluation metrics (Precision, Recall, F1, and mAP@0.5) of the YOLOv5x-hens model were 0.96, 0.96, 0.96, and 0.95, respectively, in detecting hens on the litter floor. The newly developed YOLOv5x-hens showed stable performance in detecting birds under different lighting intensities, angles, and ages over 8 weeks (i.e., birds were 8–16 weeks old). For instance, the model achieved 95% accuracy once the birds were 8 weeks old. However, younger chicks, such as one-week-old birds, were harder to track (e.g., only 25% accuracy) due to interference from equipment such as feeders, drinking lines, and perches. According to further data analysis, the model performed efficiently in real-time detection with an overall accuracy of more than 95%, which is a key step toward tracking individual birds for the evaluation of production and welfare. However, there are some limitations in the current version of the model. Detection errors came from highly overlapping stock, uneven light intensity, and occlusion by equipment (e.g., drinking lines and feeders). Future research is needed to address those issues for a higher detection rate. The current study established a novel CNN-based deep learning model for detecting hens in research cage-free facilities, which provides a technical basis for developing a machine vision system for tracking individual birds and evaluating their behaviors and welfare status in commercial cage-free houses. MDPI 2022-08-05 /pmc/articles/PMC9367364/ /pubmed/35953972 http://dx.doi.org/10.3390/ani12151983 Text en © 2022 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Yang, Xiao
Chai, Lilong
Bist, Ramesh Bahadur
Subedi, Sachin
Wu, Zihao
A Deep Learning Model for Detecting Cage-Free Hens on the Litter Floor
title A Deep Learning Model for Detecting Cage-Free Hens on the Litter Floor
title_full A Deep Learning Model for Detecting Cage-Free Hens on the Litter Floor
title_fullStr A Deep Learning Model for Detecting Cage-Free Hens on the Litter Floor
title_full_unstemmed A Deep Learning Model for Detecting Cage-Free Hens on the Litter Floor
title_short A Deep Learning Model for Detecting Cage-Free Hens on the Litter Floor
title_sort deep learning model for detecting cage-free hens on the litter floor
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9367364/
https://www.ncbi.nlm.nih.gov/pubmed/35953972
http://dx.doi.org/10.3390/ani12151983
work_keys_str_mv AT yangxiao adeeplearningmodelfordetectingcagefreehensonthelitterfloor
AT chaililong adeeplearningmodelfordetectingcagefreehensonthelitterfloor
AT bistrameshbahadur adeeplearningmodelfordetectingcagefreehensonthelitterfloor
AT subedisachin adeeplearningmodelfordetectingcagefreehensonthelitterfloor
AT wuzihao adeeplearningmodelfordetectingcagefreehensonthelitterfloor
AT yangxiao deeplearningmodelfordetectingcagefreehensonthelitterfloor
AT chaililong deeplearningmodelfordetectingcagefreehensonthelitterfloor
AT bistrameshbahadur deeplearningmodelfordetectingcagefreehensonthelitterfloor
AT subedisachin deeplearningmodelfordetectingcagefreehensonthelitterfloor
AT wuzihao deeplearningmodelfordetectingcagefreehensonthelitterfloor