Meeting the Memory Challenges of Brain-Scale Network Simulation

The development of high-performance simulation software is crucial for studying the brain connectome. Using connectome data to generate neurocomputational models requires software capable of coping with models on a variety of scales: from the microscale, investigating plasticity and dynamics of circuits in local networks, to the macroscale, investigating the interactions between distinct brain regions. Prior to any serious dynamical investigation, the first task of network simulations is to check the consistency of data integrated in the connectome and constrain ranges for yet unknown parameters. Thanks to distributed computing techniques, it is possible today to routinely simulate local cortical networks of around 10^5 neurons with up to 10^9 synapses on clusters and multi-processor shared-memory machines. However, brain-scale networks are orders of magnitude larger than such local networks, in terms of numbers of neurons and synapses as well as in terms of computational load. Such networks have been investigated in individual studies, but the underlying simulation technologies have neither been described in sufficient detail to be reproducible nor made publicly available. Here, we discover that as the network model sizes approach the regime of meso- and macroscale simulations, memory consumption on individual compute nodes becomes a critical bottleneck. This is especially relevant on modern supercomputers such as the Blue Gene/P architecture, where the available working memory per CPU core is rather limited. We develop a simple linear model to analyze the memory consumption of the constituent components of neuronal simulators as a function of network size and the number of cores used. This approach has multiple benefits. The model enables identification of key contributing components to memory saturation and prediction of the effects of potential improvements to code before any implementation takes place. As a consequence, development cycles can be shorter and less expensive. Applying the model to our freely available Neural Simulation Tool (NEST), we identify the software components dominant at different scales, and develop general strategies for reducing the memory consumption, in particular by using data structures that exploit the sparseness of the local representation of the network. We show that these adaptations enable our simulation software to scale up to the order of 10,000 processors and beyond. As memory consumption issues are likely to be relevant for any software dealing with complex connectome data on such architectures, our approach and our findings should be useful for researchers developing novel neuroinformatics solutions to the challenges posed by the connectome project.
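
To make the memory model described in the abstract concrete, the following is a minimal sketch, in Python, of a linear per-process memory estimate of the kind referred to above. The function name, the breakdown into terms, and all coefficient values are illustrative assumptions made for this record; they are not the model or the parameters reported in the paper.

# Hypothetical linear memory model: estimates the memory used on one of
# n_procs processes for a network of n_neurons neurons, each receiving
# synapses_per_neuron synapses. All coefficients are assumed values.
def memory_per_process(n_neurons, synapses_per_neuron, n_procs,
                       m_base=200e6,         # fixed overhead per process (bytes, assumed)
                       m_neuron_global=16,   # bytes kept for every neuron on every process (assumed)
                       m_neuron_local=1000,  # bytes per neuron simulated locally (assumed)
                       m_synapse_local=48):  # bytes per locally stored synapse (assumed)
    local_neurons = n_neurons / n_procs
    local_synapses = n_neurons * synapses_per_neuron / n_procs
    return (m_base
            + n_neurons * m_neuron_global        # grows with total network size, not with n_procs
            + local_neurons * m_neuron_local     # shrinks as more processes are used
            + local_synapses * m_synapse_local)  # shrinks as more processes are used

if __name__ == "__main__":
    # Example: a 10^5-neuron network with 10^4 synapses per neuron on 1024
    # processes; the absolute number depends entirely on the assumed coefficients.
    gib = memory_per_process(1e5, 1e4, 1024) / 2**30
    print(f"estimated memory per process: {gib:.2f} GiB")

In a model of this form, the term that scales with the total number of neurons does not shrink as more processes are added, so it eventually dominates the per-process memory budget; identifying and reducing such terms, for example with data structures that exploit the sparseness of the locally represented network, is the kind of optimization the abstract summarizes.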

Bibliographic Details
Main Authors: Kunkel, Susanne; Potjans, Tobias C.; Eppler, Jochen M.; Plesser, Hans Ekkehard; Morrison, Abigail; Diesmann, Markus
Format: Online Article Text
Language: English
Published: Frontiers Research Foundation, 2012-01-24
Subjects: Neuroscience
Journal: Front Neuroinform
Collection: PubMed (National Center for Biotechnology Information)
Record ID: pubmed-3264885
Record Format: MEDLINE/PubMed
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3264885/
https://www.ncbi.nlm.nih.gov/pubmed/22291636
http://dx.doi.org/10.3389/fninf.2011.00035
License: Copyright © 2012 Kunkel, Potjans, Eppler, Plesser, Morrison and Diesmann. This is an open-access article distributed under the terms of the Creative Commons Attribution Non Commercial License (http://www.frontiersin.org/licenseagreement), which permits non-commercial use, distribution, and reproduction in other forums, provided the original authors and source are credited.