A neuromechanistic model for rhythmic beat generation
When listening to music, humans can easily identify and move to the beat. Numerous experimental studies have identified brain regions that may be involved with beat perception and representation. Several theoretical and algorithmic approaches have been proposed to account for this ability…
Main Authors: | Bose, Amitabha; Byrne, Áine; Rinzel, John
---|---
Format: | Online Article Text
Language: | English
Published: | Public Library of Science, 2019
Subjects: | Research Article
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6508617/ https://www.ncbi.nlm.nih.gov/pubmed/31071078 http://dx.doi.org/10.1371/journal.pcbi.1006450
_version_ | 1783417100025462784 |
author | Bose, Amitabha; Byrne, Áine; Rinzel, John
author_facet | Bose, Amitabha; Byrne, Áine; Rinzel, John
author_sort | Bose, Amitabha |
collection | PubMed |
description | When listening to music, humans can easily identify and move to the beat. Numerous experimental studies have identified brain regions that may be involved with beat perception and representation. Several theoretical and algorithmic approaches have been proposed to account for this ability. Related to, but different from, the issue of how we perceive a beat is the question of how we learn to generate and hold a beat. In this paper, we introduce a neuronal framework for a beat generator that is capable of learning isochronous rhythms over a range of frequencies that are relevant to music and speech. Our approach combines ideas from error-correction and entrainment models to investigate the dynamics of how a biophysically based neuronal network model synchronizes its period and phase to match those of an external stimulus. The model makes novel use of ongoing faster gamma rhythms to form a set of discrete clocks that provide estimates, but not exact information, of how well the beat generator spike times match those of a stimulus sequence. The beat generator is endowed with plasticity, allowing it to quickly learn and thereby adjust its spike times to achieve synchronization. Our model makes generalizable predictions about the existence of asymmetries in the synchronization process, as well as specific predictions about resynchronization times after changes in stimulus tempo or phase. Analysis of the model demonstrates that accurate rhythmic timekeeping can be achieved over a range of frequencies relevant to music, in a manner that is robust to changes in parameters and to the presence of noise. |
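The core idea in the abstract, counting discrete gamma cycles between stimulus onsets and beat-generator spikes and then error-correcting the generator's period and phase, can be illustrated with a short script. The sketch below is a minimal, hypothetical reading of that counting/error-correction loop, not the authors' biophysical network model: the functions `gamma_count` and `learn_beat`, the 40 Hz clock, and the gains `alpha` and `beta` are all illustrative assumptions.

```python
import random

GAMMA_HZ = 40.0                  # assumed gamma clock frequency
TICK = 1000.0 / GAMMA_HZ         # one gamma cycle in ms (25 ms)

def gamma_count(t_from, t_to):
    """Whole gamma cycles elapsed between two events. Counting
    discretizes time, so the beat generator (BG) gets an estimate
    of an interval, not its exact length."""
    return round((t_to - t_from) / TICK)

def learn_beat(stimulus_times, period0=700.0, alpha=0.6, beta=0.4):
    """Error-correction loop with hypothetical gains: alpha nudges the
    BG period toward the counted stimulus period, beta pulls each BG
    spike toward the nearest stimulus onset."""
    period = period0
    t_bg = stimulus_times[0] + random.uniform(-50.0, 50.0)  # rough first spike
    bg_times = [t_bg]
    for s_prev, s_next in zip(stimulus_times, stimulus_times[1:]):
        # period error: counted stimulus period minus counted BG period
        period_err = (gamma_count(s_prev, s_next) - gamma_count(0.0, period)) * TICK
        period += alpha * period_err
        # phase error: how far the last BG spike fell from the stimulus onset
        phase_err = gamma_count(s_prev, t_bg) * TICK
        t_bg = t_bg + period - beta * phase_err
        bg_times.append(t_bg)
    return bg_times

if __name__ == "__main__":
    stim = [i * 600.0 for i in range(12)]   # isochronous stimulus, 600 ms period (100 bpm)
    for t in learn_beat(stim)[-3:]:
        print(f"BG spike at {t:7.1f} ms")   # settles near multiples of 600 ms
```

Because the clocks count whole 25 ms gamma cycles, the loop locks to the stimulus only to within about one cycle, so a small constant offset can persist; that limitation mirrors the abstract's point that the discrete clocks provide estimates rather than exact timing information.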
format | Online Article Text |
id | pubmed-6508617 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-6508617 2019-05-23 A neuromechanistic model for rhythmic beat generation. Bose, Amitabha; Byrne, Áine; Rinzel, John. PLoS Comput Biol, Research Article. [Abstract as in the description field above.] Public Library of Science 2019-05-09 /pmc/articles/PMC6508617/ /pubmed/31071078 http://dx.doi.org/10.1371/journal.pcbi.1006450 Text en © 2019 Bose et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
spellingShingle | Research Article Bose, Amitabha Byrne, Áine Rinzel, John A neuromechanistic model for rhythmic beat generation |
title | A neuromechanistic model for rhythmic beat generation |
title_full | A neuromechanistic model for rhythmic beat generation |
title_fullStr | A neuromechanistic model for rhythmic beat generation |
title_full_unstemmed | A neuromechanistic model for rhythmic beat generation |
title_short | A neuromechanistic model for rhythmic beat generation |
title_sort | neuromechanistic model for rhythmic beat generation |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6508617/ https://www.ncbi.nlm.nih.gov/pubmed/31071078 http://dx.doi.org/10.1371/journal.pcbi.1006450 |
work_keys_str_mv | AT boseamitabha aneuromechanisticmodelforrhythmicbeatgeneration AT byrneaine aneuromechanisticmodelforrhythmicbeatgeneration AT rinzeljohn aneuromechanisticmodelforrhythmicbeatgeneration AT boseamitabha neuromechanisticmodelforrhythmicbeatgeneration AT byrneaine neuromechanisticmodelforrhythmicbeatgeneration AT rinzeljohn neuromechanisticmodelforrhythmicbeatgeneration |