Pre‐movement event‐related potentials and multivariate pattern of EEG encode action outcome prediction
Main authors: | , , , |
Format: | Online Article Text |
Language: | English |
Published: | John Wiley & Sons, Inc., 2023 |
Subjects: | |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10619393/ https://www.ncbi.nlm.nih.gov/pubmed/37792296 http://dx.doi.org/10.1002/hbm.26506 |
Summary: | Self‐initiated movements are accompanied by an efference copy, a motor command sent from motor regions to the sensory cortices, containing a prediction of the movement's sensory outcome. Previous studies have proposed pre‐motor event‐related potentials (ERPs), including the readiness potential (RP) and its lateralized sub‐component (LRP), as potential neural markers of action feedback prediction. However, it is not known how specific these neural markers are for voluntary (active) movements as compared to involuntary (passive) movements, which produce much of the same sensory feedback (tactile, proprioceptive) but are not accompanied by an efference copy. The goal of the current study was to investigate how active and passive movements are distinguishable from premotor electroencephalography (EEG), and to examine if this change of neural activity differs when participants engage in tasks that differ in their expectation of sensory outcomes. Participants made active (self‐initiated) or passive (finger moved by device) finger movements that led to either visual or auditory stimuli (100 ms delay), or to no immediate contingency effects (control). We investigated the time window before the movement onset by measuring pre‐movement ERPs time‐locked to the button press. For the RP, we observed an interaction between task and movement, driven by movement differences in the visual and auditory but not the control conditions. The LRP conversely showed only a main effect of movement. We then used multivariate pattern analysis to decode movements (active vs. passive). The results revealed ramping decoding for all tasks from around −800 ms onwards, up to an accuracy of approximately 85% at the movement. Importantly, similar to the RP, we observed lower decoding accuracies for the control condition than for the visual and auditory conditions, but only shortly (from −200 ms) before the button press. We also decoded visual vs. auditory conditions. Here, task was decodable for both active and passive conditions, but the active condition showed increased decoding shortly before the button press. Taken together, our results provide robust evidence that pre‐movement EEG activity may represent action‐feedback prediction in which information about the subsequent sensory outcome is encoded. |