ANIMAL-SPOT enables animal-independent signal detection and classification using deep learning

Bioacoustic research spans a wide range of biological questions and applications, relying on identification of target species or smaller acoustic units, such as distinct call types. However, manually identifying the signal of interest is time-intensive, error-prone, and becomes unfeasible with large...

Full description

Bibliographic Details
Main Authors: Bergler, Christian, Smeele, Simeon Q., Tyndel, Stephen A., Barnhill, Alexander, Ortiz, Sara T., Kalan, Ammie K., Cheng, Rachael Xi, Brinkløv, Signe, Osiecka, Anna N., Tougaard, Jakob, Jakobsen, Freja, Wahlberg, Magnus, Nöth, Elmar, Maier, Andreas, Klump, Barbara C.
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9763499/
https://www.ncbi.nlm.nih.gov/pubmed/36535999
http://dx.doi.org/10.1038/s41598-022-26429-y
_version_ 1784853074689392640
author Bergler, Christian
Smeele, Simeon Q.
Tyndel, Stephen A.
Barnhill, Alexander
Ortiz, Sara T.
Kalan, Ammie K.
Cheng, Rachael Xi
Brinkløv, Signe
Osiecka, Anna N.
Tougaard, Jakob
Jakobsen, Freja
Wahlberg, Magnus
Nöth, Elmar
Maier, Andreas
Klump, Barbara C.
author_sort Bergler, Christian
collection PubMed
description Bioacoustic research spans a wide range of biological questions and applications, relying on identification of target species or smaller acoustic units, such as distinct call types. However, manually identifying the signal of interest is time-intensive, error-prone, and becomes unfeasible with large data volumes. Therefore, machine-driven algorithms are increasingly applied to various bioacoustic signal identification challenges. Nevertheless, biologists still have major difficulties trying to transfer existing animal- and/or scenario-related machine learning approaches to their specific animal datasets and scientific questions. This study presents an animal-independent, open-source deep learning framework, along with a detailed user guide. Three signal identification tasks, commonly encountered in bioacoustics research, were investigated: (1) target signal vs. background noise detection, (2) species classification, and (3) call type categorization. ANIMAL-SPOT successfully segmented human-annotated target signals in data volumes representing 10 distinct animal species and 1 additional genus, resulting in a mean test accuracy of 97.9%, together with an average area under the ROC curve (AUC) of 95.9%, when predicting on unseen recordings. Moreover, an average segmentation accuracy and F1-score of 95.4% was achieved on the publicly available BirdVox-Full-Night data corpus. In addition, multi-class species and call type classification resulted in 96.6% and 92.7% accuracy on unseen test data, as well as 95.2% and 88.4% regarding previous animal-specific machine-based detection excerpts. Furthermore, an Unweighted Average Recall (UAR) of 89.3% outperformed the multi-species classification baseline system of the ComParE 2021 Primate Sub-Challenge. Besides animal independence, ANIMAL-SPOT does not rely on expert knowledge or special computing resources, thereby making deep-learning-based bioacoustic signal identification accessible to a broad audience.
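As an illustration of task (1) described in the abstract (target signal vs. background noise detection), the following is a minimal sketch of how such a binary classifier over spectrogram excerpts could look in PyTorch. This is not the ANIMAL-SPOT implementation itself; the network architecture, the 128-mel input resolution, and all layer sizes are assumptions chosen for brevity.

# Minimal illustrative sketch (not the ANIMAL-SPOT code base): a binary
# "target signal vs. background noise" classifier operating on spectrogram
# excerpts, mirroring task (1) from the abstract. All layer sizes and the
# 128 x 256 input resolution are assumptions for illustration only.
import torch
import torch.nn as nn

class SignalVsNoiseCNN(nn.Module):
    def __init__(self, n_classes: int = 2):
        super().__init__()
        # Two small conv blocks followed by adaptive pooling so that
        # arbitrary spectrogram sizes map to a fixed-length feature vector.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, spectrogram: torch.Tensor) -> torch.Tensor:
        # spectrogram: (batch, 1, freq_bins, time_frames)
        x = self.features(spectrogram)
        return self.classifier(x.flatten(1))

if __name__ == "__main__":
    model = SignalVsNoiseCNN()
    dummy_batch = torch.randn(8, 1, 128, 256)  # 8 fake spectrogram excerpts
    logits = model(dummy_batch)                # shape (8, 2): signal vs. noise scores
    print(logits.shape)

The same binary recipe extends naturally to tasks (2) and (3) by raising n_classes to the number of species or call types and training on the corresponding labels.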
format Online
Article
Text
id pubmed-9763499
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-9763499 2022-12-21
ANIMAL-SPOT enables animal-independent signal detection and classification using deep learning
Bergler, Christian; Smeele, Simeon Q.; Tyndel, Stephen A.; Barnhill, Alexander; Ortiz, Sara T.; Kalan, Ammie K.; Cheng, Rachael Xi; Brinkløv, Signe; Osiecka, Anna N.; Tougaard, Jakob; Jakobsen, Freja; Wahlberg, Magnus; Nöth, Elmar; Maier, Andreas; Klump, Barbara C.
Sci Rep, Article
Bioacoustic research spans a wide range of biological questions and applications, relying on identification of target species or smaller acoustic units, such as distinct call types. However, manually identifying the signal of interest is time-intensive, error-prone, and becomes unfeasible with large data volumes. Therefore, machine-driven algorithms are increasingly applied to various bioacoustic signal identification challenges. Nevertheless, biologists still have major difficulties trying to transfer existing animal- and/or scenario-related machine learning approaches to their specific animal datasets and scientific questions. This study presents an animal-independent, open-source deep learning framework, along with a detailed user guide. Three signal identification tasks, commonly encountered in bioacoustics research, were investigated: (1) target signal vs. background noise detection, (2) species classification, and (3) call type categorization. ANIMAL-SPOT successfully segmented human-annotated target signals in data volumes representing 10 distinct animal species and 1 additional genus, resulting in a mean test accuracy of 97.9%, together with an average area under the ROC curve (AUC) of 95.9%, when predicting on unseen recordings. Moreover, an average segmentation accuracy and F1-score of 95.4% was achieved on the publicly available BirdVox-Full-Night data corpus. In addition, multi-class species and call type classification resulted in 96.6% and 92.7% accuracy on unseen test data, as well as 95.2% and 88.4% regarding previous animal-specific machine-based detection excerpts. Furthermore, an Unweighted Average Recall (UAR) of 89.3% outperformed the multi-species classification baseline system of the ComParE 2021 Primate Sub-Challenge. Besides animal independence, ANIMAL-SPOT does not rely on expert knowledge or special computing resources, thereby making deep-learning-based bioacoustic signal identification accessible to a broad audience.
Nature Publishing Group UK, 2022-12-19
/pmc/articles/PMC9763499/ /pubmed/36535999 http://dx.doi.org/10.1038/s41598-022-26429-y
Text, en
© The Author(s) 2022. Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/.
title ANIMAL-SPOT enables animal-independent signal detection and classification using deep learning
title_sort animal-spot enables animal-independent signal detection and classification using deep learning
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9763499/
https://www.ncbi.nlm.nih.gov/pubmed/36535999
http://dx.doi.org/10.1038/s41598-022-26429-y