Analyzing animal behavior via classifying each video frame using convolutional neural networks
High-throughput analysis of animal behavior requires software to analyze videos. Such software analyzes each frame individually, detecting animals’ body parts. But the image analysis rarely attempts to recognize “behavioral states”—e.g., actions or facial expressions—directly from the image instead of using the detected body parts.
| | |
|---|---|
| Main Authors: | Stern, Ulrich; He, Ruo; Yang, Chung-Hui |
| Format: | Online Article Text |
| Language: | English |
| Published: | Nature Publishing Group, 2015 |
| Subjects: | Article |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4585819/ https://www.ncbi.nlm.nih.gov/pubmed/26394695 http://dx.doi.org/10.1038/srep14351 |
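The approach described in the abstract amounts to training a convolutional network to label each fly-centered video frame directly as “on” or “off” the egg-laying substrate. The following is a minimal, hypothetical sketch of such a per-frame classifier; the framework (PyTorch), architecture, input resolution, and the name `FrameClassifier` are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of per-frame binary classification ("on" vs. "off" substrate)
# with a small CNN. This is NOT the network or code from the paper; layer sizes,
# 64x64 input crops, and the PyTorch framework are illustrative assumptions.
import torch
import torch.nn as nn

class FrameClassifier(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Two conv/pool stages followed by a small fully connected head.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2),  # grayscale frame crop
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 128),  # assumes 64x64 input crops
            nn.ReLU(),
            nn.Linear(128, num_classes),   # logits for "off" (0) / "on" (1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Classify a batch of fly-centered 64x64 crops, one per video frame.
model = FrameClassifier().eval()
frames = torch.rand(8, 1, 64, 64)           # stand-in for real frame crops
with torch.no_grad():
    labels = model(frames).argmax(dim=1)    # 0 = "off", 1 = "on"
print(labels.tolist())
```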
| Field | Value |
|---|---|
| _version_ | 1782392283856896000 |
| author | Stern, Ulrich; He, Ruo; Yang, Chung-Hui |
| author_facet | Stern, Ulrich; He, Ruo; Yang, Chung-Hui |
| author_sort | Stern, Ulrich |
| collection | PubMed |
| description | High-throughput analysis of animal behavior requires software to analyze videos. Such software analyzes each frame individually, detecting animals’ body parts. But the image analysis rarely attempts to recognize “behavioral states”—e.g., actions or facial expressions—directly from the image instead of using the detected body parts. Here, we show that convolutional neural networks (CNNs)—a machine learning approach that recently became the leading technique for object recognition, human pose estimation, and human action recognition—were able to recognize directly from images whether Drosophila were “on” (standing or walking) or “off” (not in physical contact with) egg-laying substrates for each frame of our videos. We used multiple nets and image transformations to optimize accuracy for our classification task, achieving a surprisingly low error rate of just 0.072%. Classifying one of our 8 h videos took less than 3 h using a fast GPU. The approach enabled uncovering a novel egg-laying-induced behavior modification in Drosophila. Furthermore, it should be readily applicable to other behavior analysis tasks. |
| format | Online Article Text |
| id | pubmed-4585819 |
| institution | National Center for Biotechnology Information |
| language | English |
| publishDate | 2015 |
| publisher | Nature Publishing Group |
| record_format | MEDLINE/PubMed |
| spelling | pubmed-4585819 2015-09-29 Analyzing animal behavior via classifying each video frame using convolutional neural networks. Stern, Ulrich; He, Ruo; Yang, Chung-Hui. Sci Rep, Article. High-throughput analysis of animal behavior requires software to analyze videos. Such software analyzes each frame individually, detecting animals’ body parts. But the image analysis rarely attempts to recognize “behavioral states”—e.g., actions or facial expressions—directly from the image instead of using the detected body parts. Here, we show that convolutional neural networks (CNNs)—a machine learning approach that recently became the leading technique for object recognition, human pose estimation, and human action recognition—were able to recognize directly from images whether Drosophila were “on” (standing or walking) or “off” (not in physical contact with) egg-laying substrates for each frame of our videos. We used multiple nets and image transformations to optimize accuracy for our classification task, achieving a surprisingly low error rate of just 0.072%. Classifying one of our 8 h videos took less than 3 h using a fast GPU. The approach enabled uncovering a novel egg-laying-induced behavior modification in Drosophila. Furthermore, it should be readily applicable to other behavior analysis tasks. Nature Publishing Group, 2015-09-23. /pmc/articles/PMC4585819/ /pubmed/26394695 http://dx.doi.org/10.1038/srep14351 Text, en. Copyright © 2015, Macmillan Publishers Limited. http://creativecommons.org/licenses/by/4.0/ This work is licensed under a Creative Commons Attribution 4.0 International License. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in the credit line; if the material is not included under the Creative Commons license, users will need to obtain permission from the license holder to reproduce the material. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ |
| spellingShingle | Article; Stern, Ulrich; He, Ruo; Yang, Chung-Hui; Analyzing animal behavior via classifying each video frame using convolutional neural networks |
| title | Analyzing animal behavior via classifying each video frame using convolutional neural networks |
| title_full | Analyzing animal behavior via classifying each video frame using convolutional neural networks |
| title_fullStr | Analyzing animal behavior via classifying each video frame using convolutional neural networks |
| title_full_unstemmed | Analyzing animal behavior via classifying each video frame using convolutional neural networks |
| title_short | Analyzing animal behavior via classifying each video frame using convolutional neural networks |
| title_sort | analyzing animal behavior via classifying each video frame using convolutional neural networks |
| topic | Article |
| url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4585819/ https://www.ncbi.nlm.nih.gov/pubmed/26394695 http://dx.doi.org/10.1038/srep14351 |
| work_keys_str_mv | AT sternulrich analyzinganimalbehaviorviaclassifyingeachvideoframeusingconvolutionalneuralnetworks; AT heruo analyzinganimalbehaviorviaclassifyingeachvideoframeusingconvolutionalneuralnetworks; AT yangchunghui analyzinganimalbehaviorviaclassifyingeachvideoframeusingconvolutionalneuralnetworks |
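The abstract also notes that “multiple nets and image transformations” were used to optimize accuracy. One common way to combine them at prediction time is to average softmax outputs over several independently trained models and simple transforms; the sketch below illustrates that idea only. The function `ensemble_predict`, the flip transforms, and the averaging rule are assumptions for illustration, not the combination scheme reported in the paper.

```python
# Hypothetical sketch: average predictions over several trained nets and simple
# image transformations (horizontal/vertical flips). Not the paper's method.
import torch

def ensemble_predict(models, frames: torch.Tensor) -> torch.Tensor:
    """Return per-frame class predictions averaged over models and flips.

    frames: tensor of shape (N, 1, H, W) holding fly-centered crops.
    models: iterable of trained classifiers mapping frames to class logits.
    """
    transforms = [
        lambda x: x,                     # identity
        lambda x: torch.flip(x, [3]),    # horizontal flip (width axis)
        lambda x: torch.flip(x, [2]),    # vertical flip (height axis)
    ]
    total = None
    with torch.no_grad():
        for model in models:
            model.eval()
            for t in transforms:
                probs = torch.softmax(model(t(frames)), dim=1)
                total = probs if total is None else total + probs
    return total.argmax(dim=1)  # decision from averaged probabilities

# Usage with a classifier like the FrameClassifier sketched earlier (hypothetical):
# nets = [FrameClassifier() for _ in range(3)]   # e.g., trained with different seeds
# labels = ensemble_predict(nets, frames)        # 0 = "off", 1 = "on" per frame
```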