FedAAR: A Novel Federated Learning Framework for Animal Activity Recognition with Wearable Sensors

Bibliographic Details
Main Authors: Mao, Axiu; Huang, Endai; Gan, Haiming; Liu, Kai
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9404798/
https://www.ncbi.nlm.nih.gov/pubmed/36009732
http://dx.doi.org/10.3390/ani12162142
_version_ 1784773721996656640
author Mao, Axiu
Huang, Endai
Gan, Haiming
Liu, Kai
author_sort Mao, Axiu
collection PubMed
description SIMPLE SUMMARY: Automated animal activity recognition has achieved great success due to recent advances in deep learning, allowing staff to identify variations in the animal behavioural repertoire in real time. The high performance of deep learning largely relies on the availability of big data, which inevitably raises data privacy issues when collecting a centralised dataset from different farms. Federated learning provides a promising solution for training a shared model by coordinating multiple farms (clients) without sharing their private data, in which a global server periodically aggregates local (client) gradients to update the global model. This study develops a novel federated learning framework called FedAAR to achieve automated animal activity recognition using decentralised sensor data and, in particular, to address two major challenges resulting from data heterogeneity when applying federated learning to this task. The experiments demonstrate the performance advantages of FedAAR over the state-of-the-art, confirming the capability of our framework to enhance animal activity recognition performance. This research opens new opportunities for developing animal monitoring systems that use decentralised data from multiple farms without privacy leakage. ABSTRACT: Deep learning dominates automated animal activity recognition (AAR) tasks due to its high performance on large-scale datasets. However, constructing a centralised dataset across diverse farms raises data privacy issues. Federated learning (FL) provides a distributed learning solution for training a shared model by coordinating multiple farms (clients) without sharing their private data, but directly applying FL to AAR tasks often faces two challenges: client drift during local training and local gradient conflicts during global aggregation. In this study, we develop a novel FL framework called FedAAR to achieve AAR with wearable sensors. Specifically, we devise a prototype-guided local update module that introduces a global prototype as shared knowledge to force clients to learn consistent features, alleviating the client-drift issue. To reduce gradient conflicts between clients, we design a gradient-refinement-based aggregation module that eliminates conflicting components between local gradients during global aggregation, thereby improving agreement between clients. Experiments are conducted on a public dataset consisting of 87,621 two-second samples of accelerometer and gyroscope data to verify FedAAR’s effectiveness. The results demonstrate that FedAAR outperforms the state-of-the-art in precision (75.23%), recall (75.17%), F1-score (74.70%), and accuracy (88.88%). The ablation experiments show FedAAR’s robustness to various factors (i.e., data sizes, communication frequency, and client numbers).
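The description above names two algorithmic components, a prototype-guided local update and a gradient-refinement-based aggregation, without giving their formulas. As a rough, non-authoritative illustration of the kind of operations such modules imply (not the authors' implementation), the Python sketch below combines a prototype-alignment penalty with a PCGrad-style projection that removes conflicting components between client gradients before averaging; every function name, signature, and formula here is an assumption made for illustration.

import numpy as np

def prototype_alignment_loss(features, labels, global_prototypes):
    # Hypothetical local loss term: penalise the squared distance between each
    # local feature vector and the server-shared prototype of its class, pushing
    # clients toward consistent representations (the prototype-guided idea).
    # features: (N, D); labels: (N,) integer class ids; global_prototypes: (C, D).
    return np.mean(np.sum((features - global_prototypes[labels]) ** 2, axis=1))

def refine_gradients(client_grads):
    # PCGrad-style conflict removal (an assumed stand-in for the gradient-refinement
    # idea): when two client gradients have a negative dot product, project out
    # the conflicting component from one of them.
    refined = [g.astype(float) for g in client_grads]
    for i, g_i in enumerate(refined):
        for j, g_j in enumerate(client_grads):
            if i == j:
                continue
            dot = float(np.dot(g_i, g_j))
            if dot < 0.0:
                g_i -= dot / (float(np.dot(g_j, g_j)) + 1e-12) * g_j
    return refined

def aggregate(client_grads):
    # Average the refined client gradients to form the global update.
    return np.mean(refine_gradients(client_grads), axis=0)

# Toy usage with three clients' flattened gradients.
grads = [np.array([1.0, 2.0]), np.array([-1.5, 0.5]), np.array([0.3, 0.3])]
global_update = aggregate(grads)

In an actual federated round, each client would presumably add a term like prototype_alignment_loss to its local objective before uploading gradients, and the server would apply something like aggregate before updating the global model; the exact FedAAR procedures may differ.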
format Online
Article
Text
id pubmed-9404798
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9404798 2022-08-26 FedAAR: A Novel Federated Learning Framework for Animal Activity Recognition with Wearable Sensors Mao, Axiu; Huang, Endai; Gan, Haiming; Liu, Kai Animals (Basel) Article MDPI 2022-08-21 /pmc/articles/PMC9404798/ /pubmed/36009732 http://dx.doi.org/10.3390/ani12162142 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title FedAAR: A Novel Federated Learning Framework for Animal Activity Recognition with Wearable Sensors
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9404798/
https://www.ncbi.nlm.nih.gov/pubmed/36009732
http://dx.doi.org/10.3390/ani12162142