
Fast deep neural correspondence for tracking and identifying neurons in C. elegans using semi-synthetic training

We present an automated method to track and identify neurons in C. elegans, called ‘fast Deep Neural Correspondence’ or fDNC, based on the transformer network architecture. The model is trained once on empirically derived semi-synthetic data and then predicts neural correspondence across held-out real animals. The same pre-trained model both tracks neurons across time and identifies corresponding neurons across individuals. Performance is evaluated against hand-annotated datasets, including NeuroPAL (Yemini et al., 2021). Using only position information, the method achieves 79.1% accuracy at tracking neurons within an individual and 64.1% accuracy at identifying neurons across individuals. Accuracy at identifying neurons across individuals is even higher (78.2%) when the model is applied to a dataset published by another group (Chaudhary et al., 2021). Accuracy reaches 74.7% on our dataset when using color information from NeuroPAL. Unlike previous methods, fDNC does not require straightening or transforming the animal into a canonical coordinate system. The method is fast and predicts correspondence in 10 ms, making it suitable for future real-time applications.
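The abstract describes predicting correspondence between two sets of neuron positions. As a loose, hypothetical illustration of the underlying assignment problem only (not the paper's fDNC transformer, which learns the matching from semi-synthetic data), the sketch below matches two small 3-D point sets by brute-force minimizing total squared distance over all permutations; all names and the toy data are invented for this example.

```python
import itertools

def best_assignment(template, test):
    """Find the permutation of `test` points that minimizes total squared
    distance to `template` (brute force, feasible only for small n).
    Returns (assignment, cost), where assignment[i] is the index in
    `test` matched to template neuron i."""
    n = len(template)
    best, best_cost = None, float("inf")
    for perm in itertools.permutations(range(n)):
        cost = sum(
            sum((a - b) ** 2 for a, b in zip(template[i], test[j]))
            for i, j in enumerate(perm)
        )
        if cost < best_cost:
            best, best_cost = perm, cost
    return best, best_cost

# Toy "neuron positions": the test volume is the template, slightly
# perturbed and listed in a different order.
template = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 2.0, 0.0)]
test = [(1.1, 0.1, 0.0), (0.1, 2.0, 0.1), (0.0, 0.1, 0.0)]
assignment, cost = best_assignment(template, test)
print(assignment)  # → (2, 0, 1): each template neuron finds its perturbed copy
```

Brute force is O(n!), so this only works for a handful of points; the appeal of a learned model like the one described here is that it produces correspondences for full worm recordings in roughly 10 ms and tolerates the deformations a rigid distance criterion cannot.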


Bibliographic Details
Main Authors: Yu, Xinwei, Creamer, Matthew S, Randi, Francesco, Sharma, Anuj K, Linderman, Scott W, Leifer, Andrew M
Format: Online Article Text
Language: English
Published: eLife Sciences Publications, Ltd 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8367385/
https://www.ncbi.nlm.nih.gov/pubmed/34259623
http://dx.doi.org/10.7554/eLife.66410
_version_ 1783739046751633408
author Yu, Xinwei
Creamer, Matthew S
Randi, Francesco
Sharma, Anuj K
Linderman, Scott W
Leifer, Andrew M
author_facet Yu, Xinwei
Creamer, Matthew S
Randi, Francesco
Sharma, Anuj K
Linderman, Scott W
Leifer, Andrew M
author_sort Yu, Xinwei
collection PubMed
description We present an automated method to track and identify neurons in C. elegans, called ‘fast Deep Neural Correspondence’ or fDNC, based on the transformer network architecture. The model is trained once on empirically derived semi-synthetic data and then predicts neural correspondence across held-out real animals. The same pre-trained model both tracks neurons across time and identifies corresponding neurons across individuals. Performance is evaluated against hand-annotated datasets, including NeuroPAL (Yemini et al., 2021). Using only position information, the method achieves 79.1% accuracy at tracking neurons within an individual and 64.1% accuracy at identifying neurons across individuals. Accuracy at identifying neurons across individuals is even higher (78.2%) when the model is applied to a dataset published by another group (Chaudhary et al., 2021). Accuracy reaches 74.7% on our dataset when using color information from NeuroPAL. Unlike previous methods, fDNC does not require straightening or transforming the animal into a canonical coordinate system. The method is fast and predicts correspondence in 10 ms, making it suitable for future real-time applications.
format Online
Article
Text
id pubmed-8367385
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher eLife Sciences Publications, Ltd
record_format MEDLINE/PubMed
spelling pubmed-83673852021-08-18 Fast deep neural correspondence for tracking and identifying neurons in C. elegans using semi-synthetic training Yu, Xinwei Creamer, Matthew S Randi, Francesco Sharma, Anuj K Linderman, Scott W Leifer, Andrew M eLife Neuroscience We present an automated method to track and identify neurons in C. elegans, called ‘fast Deep Neural Correspondence’ or fDNC, based on the transformer network architecture. The model is trained once on empirically derived semi-synthetic data and then predicts neural correspondence across held-out real animals. The same pre-trained model both tracks neurons across time and identifies corresponding neurons across individuals. Performance is evaluated against hand-annotated datasets, including NeuroPAL (Yemini et al., 2021). Using only position information, the method achieves 79.1% accuracy at tracking neurons within an individual and 64.1% accuracy at identifying neurons across individuals. Accuracy at identifying neurons across individuals is even higher (78.2%) when the model is applied to a dataset published by another group (Chaudhary et al., 2021). Accuracy reaches 74.7% on our dataset when using color information from NeuroPAL. Unlike previous methods, fDNC does not require straightening or transforming the animal into a canonical coordinate system. The method is fast and predicts correspondence in 10 ms, making it suitable for future real-time applications. eLife Sciences Publications, Ltd 2021-07-14 /pmc/articles/PMC8367385/ /pubmed/34259623 http://dx.doi.org/10.7554/eLife.66410 Text en © 2021, Yu et al https://creativecommons.org/licenses/by/4.0/ This article is distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use and redistribution provided that the original author and source are credited.
spellingShingle Neuroscience
Yu, Xinwei
Creamer, Matthew S
Randi, Francesco
Sharma, Anuj K
Linderman, Scott W
Leifer, Andrew M
Fast deep neural correspondence for tracking and identifying neurons in C. elegans using semi-synthetic training
title Fast deep neural correspondence for tracking and identifying neurons in C. elegans using semi-synthetic training
title_full Fast deep neural correspondence for tracking and identifying neurons in C. elegans using semi-synthetic training
title_fullStr Fast deep neural correspondence for tracking and identifying neurons in C. elegans using semi-synthetic training
title_full_unstemmed Fast deep neural correspondence for tracking and identifying neurons in C. elegans using semi-synthetic training
title_short Fast deep neural correspondence for tracking and identifying neurons in C. elegans using semi-synthetic training
title_sort fast deep neural correspondence for tracking and identifying neurons in c. elegans using semi-synthetic training
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8367385/
https://www.ncbi.nlm.nih.gov/pubmed/34259623
http://dx.doi.org/10.7554/eLife.66410
work_keys_str_mv AT yuxinwei fastdeepneuralcorrespondencefortrackingandidentifyingneuronsincelegansusingsemisynthetictraining
AT creamermatthews fastdeepneuralcorrespondencefortrackingandidentifyingneuronsincelegansusingsemisynthetictraining
AT randifrancesco fastdeepneuralcorrespondencefortrackingandidentifyingneuronsincelegansusingsemisynthetictraining
AT sharmaanujk fastdeepneuralcorrespondencefortrackingandidentifyingneuronsincelegansusingsemisynthetictraining
AT lindermanscottw fastdeepneuralcorrespondencefortrackingandidentifyingneuronsincelegansusingsemisynthetictraining
AT leiferandrewm fastdeepneuralcorrespondencefortrackingandidentifyingneuronsincelegansusingsemisynthetictraining