Marker-less tracking system for multiple mice using Mask R-CNN
Although the appropriate evaluation of mouse behavior is crucial in pharmacological research, most current methods focus on single-mouse behavior under light conditions, owing to the limitations of human observation and experimental tools. In this study, we aimed to develop a novel marker-less tracking method for multiple mice with top-view videos using deep-learning-based techniques.
Main authors: Sakamoto, Naoaki; Kakeno, Hitoshi; Ozaki, Noriko; Miyazaki, Yusuke; Kobayashi, Koji; Murata, Takahisa
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2023
Subjects: Neuroscience
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9853548/ https://www.ncbi.nlm.nih.gov/pubmed/36688129 http://dx.doi.org/10.3389/fnbeh.2022.1086242
_version_ | 1784872926910087168 |
author | Sakamoto, Naoaki; Kakeno, Hitoshi; Ozaki, Noriko; Miyazaki, Yusuke; Kobayashi, Koji; Murata, Takahisa
author_facet | Sakamoto, Naoaki; Kakeno, Hitoshi; Ozaki, Noriko; Miyazaki, Yusuke; Kobayashi, Koji; Murata, Takahisa
author_sort | Sakamoto, Naoaki |
collection | PubMed |
description | Although the appropriate evaluation of mouse behavior is crucial in pharmacological research, most current methods focus on single-mouse behavior under light conditions, owing to the limitations of human observation and experimental tools. In this study, we aimed to develop a novel marker-less tracking method for multiple mice with top-view videos using deep-learning-based techniques. The following stepwise method was introduced: (i) detection of mouse contours, (ii) assignment of identifiers (IDs) to each mouse, and (iii) correction of mis-predictions. The behavior of C57BL/6 mice was recorded in an open-field arena, and the mouse contours were manually annotated for hundreds of frame images. Then, we trained the mask regional convolutional neural network (Mask R-CNN) with all annotated images. The mouse contours predicted by the trained model in each frame were assigned IDs by calculating the similarities of every mouse pair between frames. After assigning IDs, correction steps were applied to remove predictive errors semi-automatically. The established method could accurately track two to four mice in first-look videos recorded under light conditions. The method could also be applied to videos recorded under dark conditions, extending our ability to accurately observe and analyze the sociality of nocturnal mice. This technology would enable a new approach to understanding mouse sociality and advancing pharmacological research.
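The abstract's step (ii) — assigning IDs by computing similarities for every mouse pair between frames — can be illustrated with a minimal Python sketch. This is not the authors' code: the choice of intersection-over-union (IoU) between binary masks as the similarity measure, the greedy matching strategy, and the `0.3` threshold are all illustrative assumptions.

```python
# Sketch of cross-frame ID assignment by mask similarity.
# Masks are represented as sets of (x, y) pixel coordinates;
# a real pipeline would use the masks predicted by Mask R-CNN.

def mask_iou(a: set, b: set) -> float:
    """Intersection-over-union of two binary masks given as pixel sets."""
    union = len(a | b)
    return len(a & b) / union if union else 0.0

def assign_ids(prev: dict, curr: list, iou_threshold: float = 0.3) -> dict:
    """Match each detected mask in `curr` to an existing ID from `prev`
    (a dict mapping id -> mask) by highest IoU, greedily; any unmatched
    detection is given a fresh ID. Returns a new id -> mask mapping."""
    # Score every (previous mouse, current detection) pair, best first.
    pairs = sorted(
        ((mask_iou(pm, cm), pid, ci)
         for pid, pm in prev.items()
         for ci, cm in enumerate(curr)),
        reverse=True,
    )
    assigned, used_ids, used_dets = {}, set(), set()
    for iou, pid, ci in pairs:
        if iou < iou_threshold or pid in used_ids or ci in used_dets:
            continue
        assigned[pid] = curr[ci]
        used_ids.add(pid)
        used_dets.add(ci)
    # Detections with no sufficiently similar predecessor get new IDs.
    next_id = max(prev, default=-1) + 1
    for ci, cm in enumerate(curr):
        if ci not in used_dets:
            assigned[next_id] = cm
            next_id += 1
    return assigned
```

Greedy matching keeps the sketch short; a production tracker would more likely use optimal assignment (e.g., the Hungarian algorithm) so that one mouse crossing over another does not steal its neighbor's ID.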
format | Online Article Text |
id | pubmed-9853548 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-98535482023-01-21 Marker-less tracking system for multiple mice using Mask R-CNN Sakamoto, Naoaki; Kakeno, Hitoshi; Ozaki, Noriko; Miyazaki, Yusuke; Kobayashi, Koji; Murata, Takahisa Front Behav Neurosci Neuroscience Frontiers Media S.A. 2023-01-06 /pmc/articles/PMC9853548/ /pubmed/36688129 http://dx.doi.org/10.3389/fnbeh.2022.1086242 Text en Copyright © 2023 Sakamoto, Kakeno, Ozaki, Miyazaki, Kobayashi and Murata.
https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle | Neuroscience; Sakamoto, Naoaki; Kakeno, Hitoshi; Ozaki, Noriko; Miyazaki, Yusuke; Kobayashi, Koji; Murata, Takahisa; Marker-less tracking system for multiple mice using Mask R-CNN
title | Marker-less tracking system for multiple mice using Mask R-CNN |
title_full | Marker-less tracking system for multiple mice using Mask R-CNN |
title_fullStr | Marker-less tracking system for multiple mice using Mask R-CNN |
title_full_unstemmed | Marker-less tracking system for multiple mice using Mask R-CNN |
title_short | Marker-less tracking system for multiple mice using Mask R-CNN |
title_sort | marker-less tracking system for multiple mice using mask r-cnn |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9853548/ https://www.ncbi.nlm.nih.gov/pubmed/36688129 http://dx.doi.org/10.3389/fnbeh.2022.1086242 |
work_keys_str_mv | AT sakamotonaoaki markerlesstrackingsystemformultiplemiceusingmaskrcnn AT kakenohitoshi markerlesstrackingsystemformultiplemiceusingmaskrcnn AT ozakinoriko markerlesstrackingsystemformultiplemiceusingmaskrcnn AT miyazakiyusuke markerlesstrackingsystemformultiplemiceusingmaskrcnn AT kobayashikoji markerlesstrackingsystemformultiplemiceusingmaskrcnn AT muratatakahisa markerlesstrackingsystemformultiplemiceusingmaskrcnn |