Learning Numerosity Representations with Transformers: Number Generation Tasks and Out-of-Distribution Generalization
One of the most rapidly advancing areas of deep learning research aims at creating models that learn to disentangle the latent factors of variation from a data distribution. However, modeling joint probability mass functions is usually prohibitive, which motivates the use of conditional models assum...
Main authors: Boccato, Tommaso; Testolin, Alberto; Zorzi, Marco
Format: Online Article Text
Language: English
Published: MDPI, 2021
Online access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8303966/
https://www.ncbi.nlm.nih.gov/pubmed/34356398
http://dx.doi.org/10.3390/e23070857
Similar items
- Electrophysiological Signatures of Numerosity Encoding in a Delayed Match-to-Sample Task
  by: Fu, Wanlu, et al.
  Published: (2022)
- An emergentist perspective on the origin of number sense
  by: Zorzi, Marco, et al.
  Published: (2018)
- Number and Continuous Magnitude Processing Depends on Task Goals and Numerosity Ratio
  by: Leibovich-Raveh, Tali, et al.
  Published: (2018)
- Motion along the mental number line reveals shared representations for numerosity and space
  by: Schwiedrzik, Caspar M., et al.
  Published: (2016)
- Representation of numerosity in posterior parietal cortex
  by: Roitman, Jamie D., et al.
  Published: (2012)