On separating long- and short-term memories in hyperdimensional computing
Operations on high-dimensional, fixed-width vectors can be used to distribute information from several vectors over a single vector of the same width. For example, a set of key-value pairs can be encoded into a single vector with multiplication and addition of the corresponding key and value vectors...
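The encoding scheme the abstract describes (binding keys to values with component-wise multiplication, bundling the bound pairs with component-wise addition, and cleaning up a noisy query against an item memory) can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation; the bipolar (+1/−1) vector format and the sizes D = 10,000 and M = 50 are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector width (assumed for illustration)
M = 50      # number of key-value pairs stored

# Random bipolar (+1/-1) hypervectors serve as the codebook (item memory).
keys = rng.choice([-1, 1], size=(M, D))
values = rng.choice([-1, 1], size=(M, D))

# Bind each key to its value (component-wise multiplication), then bundle
# all bound pairs into one superposition vector (component-wise addition).
memory = np.sum(keys * values, axis=0)

# Query: multiplying by a bipolar key unbinds it (key * key = all-ones),
# leaving that key's value plus crosstalk noise from the other pairs.
noisy_value = keys[7] * memory

# Cleanup: retrieve the exact vector from the codebook by picking the
# item-memory entry with the highest dot-product similarity.
best = np.argmax(values @ noisy_value)
print(best)  # 7, with high probability while M << D
```

Because a bipolar key is its own multiplicative inverse, the query recovers the target value plus crosstalk from the other M − 1 pairs; the codebook lookup removes that noise, which is why the abstract calls the raw query result approximate and the codebook result exact.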
Main Authors: | Teeters, Jeffrey L.; Kleyko, Denis; Kanerva, Pentti; Olshausen, Bruno A.
---|---
Format: | Online Article Text
Language: | English
Published: | Frontiers Media S.A., 2023
Subjects: | Neuroscience
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9869149/ https://www.ncbi.nlm.nih.gov/pubmed/36699525 http://dx.doi.org/10.3389/fnins.2022.867568
_version_ | 1784876706404761600 |
author | Teeters, Jeffrey L. Kleyko, Denis Kanerva, Pentti Olshausen, Bruno A. |
author_facet | Teeters, Jeffrey L. Kleyko, Denis Kanerva, Pentti Olshausen, Bruno A. |
author_sort | Teeters, Jeffrey L. |
collection | PubMed |
description | Operations on high-dimensional, fixed-width vectors can be used to distribute information from several vectors over a single vector of the same width. For example, a set of key-value pairs can be encoded into a single vector with multiplication and addition of the corresponding key and value vectors: the keys are bound to their values with component-wise multiplication, and the key-value pairs are combined into a single superposition vector with component-wise addition. The superposition vector is, thus, a memory which can then be queried for the value of any of the keys, but the result of the query is approximate. The exact vector is retrieved from a codebook (a.k.a. item memory), which contains vectors defined in the system. To perform these operations, the item memory vectors and the superposition vector must be the same width. Increasing the capacity of the memory requires increasing the width of the superposition and item memory vectors. In this article, we demonstrate that in a regime where many (e.g., 1,000 or more) key-value pairs are stored, an associative memory which maps key vectors to value vectors requires less memory and less computing to obtain the same reliability of storage as a superposition vector. These advantages are obtained because the number of storage locations in an associative memory can be increased without increasing the width of the vectors in the item memory. An associative memory would not replace a superposition vector as a medium of storage, but could augment it, because data recalled from an associative memory could be used in algorithms that use a superposition vector. This would be analogous to how human working memory (which stores about seven items) uses information recalled from long-term memory (which is much larger than the working memory). We demonstrate the advantages of an associative memory experimentally using the storage of large finite-state automata, which could model the storage and recall of state-dependent behavior by brains. |
format | Online Article Text |
id | pubmed-9869149 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-9869149 2023-01-24 On separating long- and short-term memories in hyperdimensional computing Teeters, Jeffrey L. Kleyko, Denis Kanerva, Pentti Olshausen, Bruno A. Front Neurosci Neuroscience Operations on high-dimensional, fixed-width vectors can be used to distribute information from several vectors over a single vector of the same width. For example, a set of key-value pairs can be encoded into a single vector with multiplication and addition of the corresponding key and value vectors: the keys are bound to their values with component-wise multiplication, and the key-value pairs are combined into a single superposition vector with component-wise addition. The superposition vector is, thus, a memory which can then be queried for the value of any of the keys, but the result of the query is approximate. The exact vector is retrieved from a codebook (a.k.a. item memory), which contains vectors defined in the system. To perform these operations, the item memory vectors and the superposition vector must be the same width. Increasing the capacity of the memory requires increasing the width of the superposition and item memory vectors. In this article, we demonstrate that in a regime where many (e.g., 1,000 or more) key-value pairs are stored, an associative memory which maps key vectors to value vectors requires less memory and less computing to obtain the same reliability of storage as a superposition vector. These advantages are obtained because the number of storage locations in an associative memory can be increased without increasing the width of the vectors in the item memory. An associative memory would not replace a superposition vector as a medium of storage, but could augment it, because data recalled from an associative memory could be used in algorithms that use a superposition vector. This would be analogous to how human working memory (which stores about seven items) uses information recalled from long-term memory (which is much larger than the working memory). We demonstrate the advantages of an associative memory experimentally using the storage of large finite-state automata, which could model the storage and recall of state-dependent behavior by brains. Frontiers Media S.A. 2023-01-09 /pmc/articles/PMC9869149/ /pubmed/36699525 http://dx.doi.org/10.3389/fnins.2022.867568 Text en Copyright © 2023 Teeters, Kleyko, Kanerva and Olshausen. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Neuroscience Teeters, Jeffrey L. Kleyko, Denis Kanerva, Pentti Olshausen, Bruno A. On separating long- and short-term memories in hyperdimensional computing |
title | On separating long- and short-term memories in hyperdimensional computing |
title_full | On separating long- and short-term memories in hyperdimensional computing |
title_fullStr | On separating long- and short-term memories in hyperdimensional computing |
title_full_unstemmed | On separating long- and short-term memories in hyperdimensional computing |
title_short | On separating long- and short-term memories in hyperdimensional computing |
title_sort | on separating long- and short-term memories in hyperdimensional computing |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9869149/ https://www.ncbi.nlm.nih.gov/pubmed/36699525 http://dx.doi.org/10.3389/fnins.2022.867568 |
work_keys_str_mv | AT teetersjeffreyl onseparatinglongandshorttermmemoriesinhyperdimensionalcomputing AT kleykodenis onseparatinglongandshorttermmemoriesinhyperdimensionalcomputing AT kanervapentti onseparatinglongandshorttermmemoriesinhyperdimensionalcomputing AT olshausenbrunoa onseparatinglongandshorttermmemoriesinhyperdimensionalcomputing |
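The abstract's alternative, an associative memory that maps key vectors directly to value vectors, can likewise be sketched. The point it illustrates is the one the article makes: the number of storage locations (rows) can grow without widening the vectors themselves, and recall returns an exact stored vector. This naive similarity-addressed lookup is a hypothetical stand-in for the article's actual memory design, which this record does not specify; the sizes and the 25% noise level are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 1_000   # vector width stays fixed
N = 5_000   # number of stored pairs can grow independently of D

# Each row pairs a key vector with the value vector it should retrieve.
keys = rng.choice([-1, 1], size=(N, D))
values = rng.choice([-1, 1], size=(N, D))

def recall(query_key):
    # Address by similarity: the best-matching stored key selects its value,
    # so the returned vector is exact rather than a noisy superposition.
    scores = keys @ query_key
    return values[np.argmax(scores)]

# Corrupt a key by flipping roughly 25% of its components, then recall.
noisy_key = keys[42] * rng.choice([1, 1, 1, -1], size=D)
print(np.array_equal(recall(noisy_key), values[42]))  # True with high probability
```

Storage and query cost here scale with the number of rows N rather than with D, which matches the abstract's claim that capacity can be increased without widening the item-memory vectors; recalled values could then feed back into superposition-vector algorithms, in the long-term/working-memory division the article draws.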