
DeepLabStream enables closed-loop behavioral experiments using deep learning-based markerless, real-time posture detection


Bibliographic Details
Main Authors: Schweihoff, Jens F., Loshakov, Matvey, Pavlova, Irina, Kück, Laura, Ewell, Laura A., Schwarz, Martin K.
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7846585/
https://www.ncbi.nlm.nih.gov/pubmed/33514883
http://dx.doi.org/10.1038/s42003-021-01654-9
Description
Summary: In general, animal behavior can be described as a neuronally driven sequence of recurring postures through time. Most currently available technologies focus on offline pose estimation with high spatiotemporal resolution. However, to correlate behavior with neuronal activity it is often necessary to detect and react online to behavioral expressions. Here we present DeepLabStream, a versatile closed-loop tool providing real-time pose estimation to deliver posture-dependent stimulations. DeepLabStream has a temporal resolution in the millisecond range, can utilize different input as well as output devices, and can be tailored to multiple experimental designs. We employ DeepLabStream to semi-autonomously run a second-order olfactory conditioning task with freely moving mice and optogenetically label neuronal ensembles active during specific head directions.
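
To make the closed-loop principle described in the summary concrete, the following is a minimal Python sketch of such a loop: estimate the animal's pose on each camera frame, compute a posture feature (here, head direction), and trigger a stimulus when the posture rule is met. The object interfaces (camera, pose_estimator, stimulator) and the keypoint names are hypothetical placeholders for illustration only; they are not the published DeepLabStream API.

    import math
    import time

    # Hypothetical posture rule: stimulate when head direction is within
    # a tolerance window around a target angle (degrees).
    HEAD_DIRECTION_TARGET = 90.0
    HEAD_DIRECTION_TOLERANCE = 15.0

    def head_direction(nose, neck):
        """Angle of the neck-to-nose vector in degrees (placeholder posture metric)."""
        return math.degrees(math.atan2(nose[1] - neck[1], nose[0] - neck[0]))

    def closed_loop(camera, pose_estimator, stimulator, fps=30):
        """Minimal closed loop: per-frame pose estimation and posture-dependent stimulation."""
        frame_interval = 1.0 / fps
        while True:
            start = time.perf_counter()
            frame = camera.grab_frame()                # input device (assumed interface)
            keypoints = pose_estimator.estimate(frame) # real-time pose estimation (assumed interface)
            angle = head_direction(keypoints["nose"], keypoints["neck"])
            if abs(angle - HEAD_DIRECTION_TARGET) < HEAD_DIRECTION_TOLERANCE:
                stimulator.trigger()                   # output device, e.g. optogenetic laser (assumed)
            # keep loop timing near the camera frame rate
            time.sleep(max(0.0, frame_interval - (time.perf_counter() - start)))

In practice, the achievable latency of such a loop depends on camera acquisition, network inference time, and output-device delay; the paper reports millisecond-range temporal resolution for DeepLabStream itself.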