
Do Brain Networks Evolve by Maximizing Their Information Flow Capacity?

Bibliographic Details
Main Authors: Antonopoulos, Chris G., Srivastava, Shambhavi, Pinto, Sandro E. de S., Baptista, Murilo S.
Format: Online Article (Text)
Language: English
Published: Public Library of Science, 2015
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4552863/
https://www.ncbi.nlm.nih.gov/pubmed/26317592
http://dx.doi.org/10.1371/journal.pcbi.1004372
Description
Summary: We propose a working hypothesis, supported by numerical simulations, that brain networks evolve based on the principle of maximization of their internal information flow capacity. We find that the synchronous behavior and information flow capacity of the evolved networks reproduce well the behaviors observed in the brain dynamical networks of Caenorhabditis elegans and humans, i.e., networks of Hindmarsh-Rose neurons whose connectivity graphs are given by these brain networks. We make a strong case for our hypothesis by showing that the neural networks with the smallest graph distance to the brain networks of Caenorhabditis elegans and humans are the Hindmarsh-Rose neural networks evolved with coupling strengths that maximize information flow capacity. Surprisingly, we find that global neural synchronization levels decrease during brain evolution, reflecting an underlying global non-Hebbian-like evolution process, which is driven by non-Hebbian-like learning behaviors for some of the clusters during evolution, and by Hebbian-like learning rules for clusters where neurons increase their synchronization.
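
As a point of reference for the node dynamics named in the summary, the following is a minimal sketch of a single Hindmarsh-Rose neuron. It is not the authors' code; the parameter values are the standard chaotic-bursting choices from the general literature and are assumptions, not values taken from the article.

    # Minimal sketch (assumption, not the authors' code): one Hindmarsh-Rose
    # neuron, the node model named in the summary above. Parameters are the
    # standard chaotic-bursting values commonly used in the literature.
    from scipy.integrate import solve_ivp

    def hindmarsh_rose(t, state, a=1.0, b=3.0, c=1.0, d=5.0,
                       r=0.005, s=4.0, x_rest=-1.6, I=3.25):
        x, y, z = state                        # membrane potential, fast and slow recovery variables
        dx = y - a * x**3 + b * x**2 - z + I   # fast subsystem driven by external current I
        dy = c - d * x**2 - y
        dz = r * (s * (x - x_rest) - z)        # slow adaptation current
        return [dx, dy, dz]

    # Integrate long enough to see the characteristic bursting in x.
    sol = solve_ivp(hindmarsh_rose, (0.0, 2000.0), [-1.6, 4.0, 2.75], max_step=0.05)

In the article, such neurons are coupled according to the C. elegans and human brain connectivity graphs and the coupling strengths are evolved; the single-neuron equations sketched here only fix the node dynamics.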