
Dead Laying Hens Detection Using TIR-NIR-Depth Images and Deep Learning on a Commercial Farm

SIMPLE SUMMARY: Timely detection of dead chickens is of great importance on commercial farms. Using multi-source images for dead chicken detection can theoretically achieve higher accuracy and robustness compared with single-source images. In this study, we introduced a pixel-level image registration...


Bibliographic Details
Main Authors: Luo, Sheng, Ma, Yiming, Jiang, Feng, Wang, Hongying, Tong, Qin, Wang, Liangju
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10251900/
https://www.ncbi.nlm.nih.gov/pubmed/37889777
http://dx.doi.org/10.3390/ani13111861
_version_ 1785056042574413824
author Luo, Sheng
Ma, Yiming
Jiang, Feng
Wang, Hongying
Tong, Qin
Wang, Liangju
author_facet Luo, Sheng
Ma, Yiming
Jiang, Feng
Wang, Hongying
Tong, Qin
Wang, Liangju
author_sort Luo, Sheng
collection PubMed
description SIMPLE SUMMARY: Timely detection of dead chickens is of great importance on commercial farms. Using multi-source images for dead chicken detection can theoretically achieve higher accuracy and robustness compared with single-source images. In this study, we introduced a pixel-level image registration method to align the near-infrared (NIR), thermal infrared (TIR), and depth images and analyzed the detection performance of models using different source images. The results of the study showed the following: The model with the NIR image performed the best among models with single-source images, and the models with dual-source images performed better than those with single-source images. The model with the TIR-NIR image or the NIR-depth image performed better than the model with the TIR-depth image. The detection performance with the TIR-NIR-depth image was better than that with single-source images but was not significantly different from that with the TIR-NIR or NIR-depth images. This study provided a reference for selecting and using multi-source images for detecting dead laying hens on commercial farms. ABSTRACT: In large-scale laying hen farming, timely detection of dead chickens helps prevent cross-infection, disease transmission, and economic loss. Dead chicken detection is still performed manually and is one of the major labor costs on commercial farms. This study proposed a new method for dead chicken detection using multi-source images and deep learning and evaluated the detection performance with different source images. We first introduced a pixel-level image registration method that used depth information to project the near-infrared (NIR) and depth images into the coordinates of the thermal infrared (TIR) image, resulting in registered images. Then, the registered single-source (TIR, NIR, depth), dual-source (TIR-NIR, TIR-depth, NIR-depth), and multi-source (TIR-NIR-depth) images were separately used to train dead chicken detection models with object detection networks, including YOLOv8n, Deformable DETR, Cascade R-CNN, and TOOD. The results showed that, at an IoU (Intersection over Union) threshold of 0.5, the performance of these models varied. Among them, the model using the NIR-depth image and Deformable DETR achieved the best performance, with an average precision (AP) of 99.7% (IoU = 0.5) and a recall of 99.0% (IoU = 0.5). As the IoU threshold increased, we found the following: The model with the NIR image achieved the best performance among models with single-source images, with an AP of 74.4% (IoU = 0.5:0.95) with Deformable DETR. The performance with dual-source images was higher than that with single-source images. The model with the TIR-NIR or NIR-depth image outperformed the model with the TIR-depth image, achieving an AP of 76.3% (IoU = 0.5:0.95) and 75.9% (IoU = 0.5:0.95) with Deformable DETR, respectively. The model with the multi-source image also achieved higher performance than those with single-source images. However, there was no significant improvement compared to the model with the TIR-NIR or NIR-depth image, and the AP of the model with the multi-source image was 76.7% (IoU = 0.5:0.95) with Deformable DETR. By analyzing the detection performance with different source images, this study provided a reference for selecting and using multi-source images for detecting dead laying hens on commercial farms.
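The registration step described above (back-projecting a pixel using its depth and re-projecting it into the TIR camera's image plane) can be sketched with a standard pinhole-camera model. This is a minimal illustration, not the authors' implementation: the intrinsic matrices `K_src` and `K_dst` and the rigid transform `(R, t)` are hypothetical calibration parameters that the record does not provide.

```python
# Hedged sketch of depth-based pixel registration between two cameras,
# assuming pinhole intrinsics and a known rigid transform from the
# source (depth/NIR) camera frame to the target (TIR) camera frame.
import numpy as np

def project_pixel(u, v, depth, K_src, K_dst, R, t):
    """Map pixel (u, v) with known depth into the target camera's image plane."""
    # Back-project (u, v, depth) to a 3-D point in the source camera frame.
    x = (u - K_src[0, 2]) * depth / K_src[0, 0]
    y = (v - K_src[1, 2]) * depth / K_src[1, 1]
    p_src = np.array([x, y, depth])
    # Rigidly transform the point into the target camera frame.
    p_dst = R @ p_src + t
    # Perspective projection with the target camera's intrinsics.
    uv = K_dst @ p_dst
    return uv[0] / uv[2], uv[1] / uv[2]

# Sanity check: with identical cameras and an identity transform,
# a pixel must map onto itself.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
u2, v2 = project_pixel(100, 50, 2.0, K, K, np.eye(3), np.zeros(3))
```

Applying this mapping per pixel of the depth image yields the pixel-level correspondence needed to stack TIR, NIR, and depth channels into one registered image.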
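The AP figures quoted above depend on the IoU threshold used to decide whether a predicted box counts as a correct detection. A minimal sketch of the IoU metric and the COCO-style "IoU = 0.5:0.95" threshold sweep (ten thresholds in steps of 0.05) follows; box coordinates `(x1, y1, x2, y2)` are illustrative.

```python
# Minimal sketch of Intersection over Union (IoU) for axis-aligned boxes
# given as (x1, y1, x2, y2), and the COCO-style threshold sweep that
# "AP (IoU = 0.5:0.95)" averages over.
def iou(a, b):
    # Corners of the intersection rectangle.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

# Thresholds 0.5, 0.55, ..., 0.95 used for the averaged AP.
thresholds = [0.5 + 0.05 * i for i in range(10)]
```

A detection is a true positive at a given threshold only if its IoU with a ground-truth box reaches that threshold, which is why AP drops as the threshold sweep tightens localization requirements.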
format Online
Article
Text
id pubmed-10251900
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-10251900 2023-06-10 Dead Laying Hens Detection Using TIR-NIR-Depth Images and Deep Learning on a Commercial Farm Luo, Sheng Ma, Yiming Jiang, Feng Wang, Hongying Tong, Qin Wang, Liangju Animals (Basel) Article SIMPLE SUMMARY: Timely detection of dead chickens is of great importance on commercial farms. Using multi-source images for dead chicken detection can theoretically achieve higher accuracy and robustness compared with single-source images. In this study, we introduced a pixel-level image registration method to align the near-infrared (NIR), thermal infrared (TIR), and depth images and analyzed the detection performance of models using different source images. The results of the study showed the following: The model with the NIR image performed the best among models with single-source images, and the models with dual-source images performed better than those with single-source images. The model with the TIR-NIR image or the NIR-depth image performed better than the model with the TIR-depth image. The detection performance with the TIR-NIR-depth image was better than that with single-source images but was not significantly different from that with the TIR-NIR or NIR-depth images. This study provided a reference for selecting and using multi-source images for detecting dead laying hens on commercial farms. ABSTRACT: In large-scale laying hen farming, timely detection of dead chickens helps prevent cross-infection, disease transmission, and economic loss. Dead chicken detection is still performed manually and is one of the major labor costs on commercial farms. This study proposed a new method for dead chicken detection using multi-source images and deep learning and evaluated the detection performance with different source images. We first introduced a pixel-level image registration method that used depth information to project the near-infrared (NIR) and depth images into the coordinates of the thermal infrared (TIR) image, resulting in registered images. Then, the registered single-source (TIR, NIR, depth), dual-source (TIR-NIR, TIR-depth, NIR-depth), and multi-source (TIR-NIR-depth) images were separately used to train dead chicken detection models with object detection networks, including YOLOv8n, Deformable DETR, Cascade R-CNN, and TOOD. The results showed that, at an IoU (Intersection over Union) threshold of 0.5, the performance of these models varied. Among them, the model using the NIR-depth image and Deformable DETR achieved the best performance, with an average precision (AP) of 99.7% (IoU = 0.5) and a recall of 99.0% (IoU = 0.5). As the IoU threshold increased, we found the following: The model with the NIR image achieved the best performance among models with single-source images, with an AP of 74.4% (IoU = 0.5:0.95) with Deformable DETR. The performance with dual-source images was higher than that with single-source images. The model with the TIR-NIR or NIR-depth image outperformed the model with the TIR-depth image, achieving an AP of 76.3% (IoU = 0.5:0.95) and 75.9% (IoU = 0.5:0.95) with Deformable DETR, respectively. The model with the multi-source image also achieved higher performance than those with single-source images. However, there was no significant improvement compared to the model with the TIR-NIR or NIR-depth image, and the AP of the model with the multi-source image was 76.7% (IoU = 0.5:0.95) with Deformable DETR. By analyzing the detection performance with different source images, this study provided a reference for selecting and using multi-source images for detecting dead laying hens on commercial farms. MDPI 2023-06-02 /pmc/articles/PMC10251900/ /pubmed/37889777 http://dx.doi.org/10.3390/ani13111861 Text en © 2023 by the authors.
https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Luo, Sheng
Ma, Yiming
Jiang, Feng
Wang, Hongying
Tong, Qin
Wang, Liangju
Dead Laying Hens Detection Using TIR-NIR-Depth Images and Deep Learning on a Commercial Farm
title Dead Laying Hens Detection Using TIR-NIR-Depth Images and Deep Learning on a Commercial Farm
title_full Dead Laying Hens Detection Using TIR-NIR-Depth Images and Deep Learning on a Commercial Farm
title_fullStr Dead Laying Hens Detection Using TIR-NIR-Depth Images and Deep Learning on a Commercial Farm
title_full_unstemmed Dead Laying Hens Detection Using TIR-NIR-Depth Images and Deep Learning on a Commercial Farm
title_short Dead Laying Hens Detection Using TIR-NIR-Depth Images and Deep Learning on a Commercial Farm
title_sort dead laying hens detection using tir-nir-depth images and deep learning on a commercial farm
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10251900/
https://www.ncbi.nlm.nih.gov/pubmed/37889777
http://dx.doi.org/10.3390/ani13111861
work_keys_str_mv AT luosheng deadlayinghensdetectionusingtirnirdepthimagesanddeeplearningonacommercialfarm
AT mayiming deadlayinghensdetectionusingtirnirdepthimagesanddeeplearningonacommercialfarm
AT jiangfeng deadlayinghensdetectionusingtirnirdepthimagesanddeeplearningonacommercialfarm
AT wanghongying deadlayinghensdetectionusingtirnirdepthimagesanddeeplearningonacommercialfarm
AT tongqin deadlayinghensdetectionusingtirnirdepthimagesanddeeplearningonacommercialfarm
AT wangliangju deadlayinghensdetectionusingtirnirdepthimagesanddeeplearningonacommercialfarm