Double-Camera Fusion System for Animal-Position Awareness in Farming Pens
In livestock breeding, continuous and objective monitoring of animals is manually unfeasible due to the large scale of breeding and expensive labour. Computer vision technology can generate accurate and real-time individual animal or animal group information from video surveillance. However, the frequent occlusion between animals and changes in appearance features caused by varying lighting conditions make single-camera systems less attractive. We propose a double-camera system and image registration algorithms to spatially fuse the information from different viewpoints to solve these issues. This paper presents a deformable learning-based registration framework, where the input image pairs are initially linearly pre-registered. Then, an unsupervised convolutional neural network is employed to fit the mapping from one view to another, using a large number of unlabelled samples for training. The learned parameters are then used in a semi-supervised network and fine-tuned with a small number of manually annotated landmarks. The actual pixel displacement error is introduced as a complement to an image similarity measure. The performance of the proposed fine-tuned method is evaluated on real farming datasets and demonstrates a significant improvement in lowering the registration errors compared with commonly used feature-based and intensity-based methods. This approach also reduces the registration time of an unseen image pair to less than 0.5 s. The proposed method provides a high-quality reference processing step for improving subsequent tasks such as multi-object tracking and behaviour recognition of animals for further analysis.
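The abstract describes a two-stage approach: an unsupervised network is first trained on unlabelled image pairs with an image similarity measure, then fine-tuned semi-supervised with a pixel displacement error on a small set of manually annotated landmarks. The sketch below illustrates one plausible form of such a combined loss in PyTorch; it is not the authors' implementation, and the function names (`warp`, `registration_loss`), the MSE similarity choice, the smoothness regulariser, and the weights `lambda_smooth` and `lambda_lm` are assumptions introduced here for illustration.

```python
# Illustrative sketch (not the authors' code) of a semi-supervised registration
# loss: an unsupervised image-similarity term complemented by a pixel-displacement
# error on a few annotated landmark pairs, as the abstract describes.
# Function names, weights, and tensor layouts are assumptions.
import torch
import torch.nn.functional as F


def warp(moving, flow):
    """Warp `moving` (B, C, H, W) with a dense pixel displacement field `flow`
    (B, 2, H, W), where flow[:, 0] is the x-offset and flow[:, 1] the y-offset."""
    b, _, h, w = moving.shape
    # Base sampling grid in the normalised [-1, 1] coordinates grid_sample expects.
    ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, h, device=moving.device),
        torch.linspace(-1, 1, w, device=moving.device),
        indexing="ij",
    )
    base = torch.stack((xs, ys), dim=-1).unsqueeze(0).expand(b, -1, -1, -1)
    # Convert pixel offsets to normalised-coordinate offsets.
    norm_flow = torch.stack(
        (flow[:, 0] / ((w - 1) / 2), flow[:, 1] / ((h - 1) / 2)), dim=-1
    )
    return F.grid_sample(moving, base + norm_flow, align_corners=True)


def registration_loss(fixed, moving, flow, lm_fixed=None, lm_moving=None,
                      lambda_smooth=0.1, lambda_lm=0.01):
    """Image similarity + smoothness; if matched landmark pairs (B, K, 2) in
    pixel (x, y) coordinates are given, add their mean displacement error."""
    b, _, h, w = fixed.shape
    warped = warp(moving, flow)
    similarity = F.mse_loss(warped, fixed)                 # unsupervised term
    smooth = (flow[:, :, :, 1:] - flow[:, :, :, :-1]).abs().mean() + \
             (flow[:, :, 1:, :] - flow[:, :, :-1, :]).abs().mean()
    loss = similarity + lambda_smooth * smooth
    if lm_fixed is not None and lm_moving is not None:     # supervised complement
        k = lm_fixed.shape[1]
        # Read the predicted displacement at (the nearest pixel to) each fixed
        # landmark and compare the mapped position with the moving landmark.
        xi = lm_fixed[..., 0].round().long().clamp(0, w - 1)
        yi = lm_fixed[..., 1].round().long().clamp(0, h - 1)
        bi = torch.arange(b, device=flow.device).unsqueeze(1).expand(b, k)
        pred = lm_fixed + torch.stack(
            (flow[bi, 0, yi, xi], flow[bi, 1, yi, xi]), dim=-1
        )
        loss = loss + lambda_lm * (pred - lm_moving).norm(dim=-1).mean()
    return loss
```

In the unsupervised pre-training stage described in the abstract, a loss of this form would be used without landmarks, so only the similarity and smoothness terms drive the network; the landmark term would be enabled only during fine-tuning on the small annotated subset.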
Main authors: | Huo, Shoujun; Sun, Yue; Guo, Qinghua; Tan, Tao; Bolhuis, J. Elizabeth; Bijma, Piter; de With, Peter H. N. |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI 2022 |
Subjects: | Article |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9818956/ https://www.ncbi.nlm.nih.gov/pubmed/36613301 http://dx.doi.org/10.3390/foods12010084 |
_version_ | 1784865112815828992 |
---|---|
author | Huo, Shoujun Sun, Yue Guo, Qinghua Tan, Tao Bolhuis, J. Elizabeth Bijma, Piter de With, Peter H. N. |
author_facet | Huo, Shoujun Sun, Yue Guo, Qinghua Tan, Tao Bolhuis, J. Elizabeth Bijma, Piter de With, Peter H. N. |
author_sort | Huo, Shoujun |
collection | PubMed |
description | In livestock breeding, continuous and objective monitoring of animals is manually unfeasible due to the large scale of breeding and expensive labour. Computer vision technology can generate accurate and real-time individual animal or animal group information from video surveillance. However, the frequent occlusion between animals and changes in appearance features caused by varying lighting conditions make single-camera systems less attractive. We propose a double-camera system and image registration algorithms to spatially fuse the information from different viewpoints to solve these issues. This paper presents a deformable learning-based registration framework, where the input image pairs are initially linearly pre-registered. Then, an unsupervised convolutional neural network is employed to fit the mapping from one view to another, using a large number of unlabelled samples for training. The learned parameters are then used in a semi-supervised network and fine-tuned with a small number of manually annotated landmarks. The actual pixel displacement error is introduced as a complement to an image similarity measure. The performance of the proposed fine-tuned method is evaluated on real farming datasets and demonstrates a significant improvement in lowering the registration errors compared with commonly used feature-based and intensity-based methods. This approach also reduces the registration time of an unseen image pair to less than 0.5 s. The proposed method provides a high-quality reference processing step for improving subsequent tasks such as multi-object tracking and behaviour recognition of animals for further analysis. |
format | Online Article Text |
id | pubmed-9818956 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-9818956 2023-01-07 Double-Camera Fusion System for Animal-Position Awareness in Farming Pens Huo, Shoujun Sun, Yue Guo, Qinghua Tan, Tao Bolhuis, J. Elizabeth Bijma, Piter de With, Peter H. N. Foods Article In livestock breeding, continuous and objective monitoring of animals is manually unfeasible due to the large scale of breeding and expensive labour. Computer vision technology can generate accurate and real-time individual animal or animal group information from video surveillance. However, the frequent occlusion between animals and changes in appearance features caused by varying lighting conditions make single-camera systems less attractive. We propose a double-camera system and image registration algorithms to spatially fuse the information from different viewpoints to solve these issues. This paper presents a deformable learning-based registration framework, where the input image pairs are initially linearly pre-registered. Then, an unsupervised convolutional neural network is employed to fit the mapping from one view to another, using a large number of unlabelled samples for training. The learned parameters are then used in a semi-supervised network and fine-tuned with a small number of manually annotated landmarks. The actual pixel displacement error is introduced as a complement to an image similarity measure. The performance of the proposed fine-tuned method is evaluated on real farming datasets and demonstrates a significant improvement in lowering the registration errors compared with commonly used feature-based and intensity-based methods. This approach also reduces the registration time of an unseen image pair to less than 0.5 s. The proposed method provides a high-quality reference processing step for improving subsequent tasks such as multi-object tracking and behaviour recognition of animals for further analysis. MDPI 2022-12-23 /pmc/articles/PMC9818956/ /pubmed/36613301 http://dx.doi.org/10.3390/foods12010084 Text en © 2022 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Huo, Shoujun Sun, Yue Guo, Qinghua Tan, Tao Bolhuis, J. Elizabeth Bijma, Piter de With, Peter H. N. Double-Camera Fusion System for Animal-Position Awareness in Farming Pens |
title | Double-Camera Fusion System for Animal-Position Awareness in Farming Pens |
title_full | Double-Camera Fusion System for Animal-Position Awareness in Farming Pens |
title_fullStr | Double-Camera Fusion System for Animal-Position Awareness in Farming Pens |
title_full_unstemmed | Double-Camera Fusion System for Animal-Position Awareness in Farming Pens |
title_short | Double-Camera Fusion System for Animal-Position Awareness in Farming Pens |
title_sort | double-camera fusion system for animal-position awareness in farming pens |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9818956/ https://www.ncbi.nlm.nih.gov/pubmed/36613301 http://dx.doi.org/10.3390/foods12010084 |
work_keys_str_mv | AT huoshoujun doublecamerafusionsystemforanimalpositionawarenessinfarmingpens AT sunyue doublecamerafusionsystemforanimalpositionawarenessinfarmingpens AT guoqinghua doublecamerafusionsystemforanimalpositionawarenessinfarmingpens AT tantao doublecamerafusionsystemforanimalpositionawarenessinfarmingpens AT bolhuisjelizabeth doublecamerafusionsystemforanimalpositionawarenessinfarmingpens AT bijmapiter doublecamerafusionsystemforanimalpositionawarenessinfarmingpens AT dewithpeterhn doublecamerafusionsystemforanimalpositionawarenessinfarmingpens |