Ensembles of data-efficient vision transformers as a new paradigm for automated classification in ecology

Monitoring biodiversity is paramount to managing and protecting natural resources. Collecting images of organisms over large temporal or spatial scales is a promising practice for monitoring the biodiversity of natural ecosystems, providing large amounts of data with minimal interference with the environment. Deep learning models are currently used to automate the classification of organisms into taxonomic units. However, imprecision in these classifiers introduces measurement noise that is difficult to control and can significantly hinder the analysis and interpretation of data. We overcome this limitation through ensembles of Data-efficient image Transformers (DeiTs), which we show can reach state-of-the-art (SOTA) performance without hyperparameter tuning, provided one follows a simple fixed training schedule. We validate our results on ten ecological imaging datasets of diverse origin, ranging from plankton to birds. The performance of our ensembles of DeiTs (EDeiTs) is always comparable with the previous SOTA, and beats it in four out of ten cases. We argue that these ensembles of DeiTs perform better not because of superior single-model performance, but because of smaller overlaps in the predictions of independent models and lower top-1 probabilities, which increase the benefit of ensembling.
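
As an illustration of the approach summarized above, the following sketch shows one plausible way to build an ensemble of DeiT classifiers by averaging their softmax outputs, and to compute the two quantities the abstract points to (pairwise agreement of the models' top-1 predictions and their mean top-1 probability). The timm model names, the number of classes, and the equal-weight averaging are assumptions made for illustration; this is not the authors' published code.

# Illustrative sketch (not the authors' code): ensemble of DeiT image
# classifiers by averaging softmax outputs, plus the two diagnostics
# mentioned in the abstract (pairwise top-1 agreement, mean top-1 probability).
import torch
import timm  # assumed dependency; provides DeiT architectures

NUM_CLASSES = 10  # hypothetical number of taxonomic classes

# Several DeiT backbones; in practice each would be fine-tuned independently
# on the ecological dataset (e.g. with different random seeds).
MODEL_NAMES = ["deit_base_patch16_224", "deit_small_patch16_224", "deit_tiny_patch16_224"]
models = [
    timm.create_model(name, pretrained=False, num_classes=NUM_CLASSES).eval()
    for name in MODEL_NAMES
]

@torch.no_grad()
def ensemble_predict(images: torch.Tensor) -> torch.Tensor:
    """Average the per-model softmax probabilities with equal weights."""
    probs = torch.stack([m(images).softmax(dim=-1) for m in models])  # (M, B, C)
    return probs.mean(dim=0)                                          # (B, C)

@torch.no_grad()
def ensemble_diagnostics(images: torch.Tensor) -> tuple[float, float]:
    """Mean pairwise top-1 agreement between models and mean top-1 probability."""
    probs = torch.stack([m(images).softmax(dim=-1) for m in models])  # (M, B, C)
    top1_prob, top1_class = probs.max(dim=-1)                         # each (M, B)
    n = len(models)
    agreements = [
        (top1_class[i] == top1_class[j]).float().mean().item()
        for i in range(n) for j in range(i + 1, n)
    ]
    return sum(agreements) / len(agreements), top1_prob.mean().item()

# Usage on a dummy batch of 4 RGB images at DeiT's 224x224 input size.
batch = torch.randn(4, 3, 224, 224)
predicted_classes = ensemble_predict(batch).argmax(dim=-1)
agreement, mean_top1 = ensemble_diagnostics(batch)
print(predicted_classes, agreement, mean_top1)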

Bibliographic Details
Main Authors: Kyathanahally, S. P., Hardeman, T., Reyes, M., Merz, E., Bulas, T., Brun, P., Pomati, F., Baity-Jesi, M.
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9633651/
https://www.ncbi.nlm.nih.gov/pubmed/36329061
http://dx.doi.org/10.1038/s41598-022-21910-0
collection PubMed
id pubmed-9633651
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
spelling Sci Rep. Nature Publishing Group UK, published online 2022-11-03. © The Author(s) 2022, corrected publication 2023. Open Access under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
topic Article