Neuronal diversity can improve machine learning for physics and beyond

Bibliographic Details
Main Authors: Choudhary, Anshul; Radhakrishnan, Anil; Lindner, John F.; Sinha, Sudeshna; Ditto, William L.
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10460398/
https://www.ncbi.nlm.nih.gov/pubmed/37634029
http://dx.doi.org/10.1038/s41598-023-40766-6
Description
Summary: Diversity conveys advantages in nature, yet homogeneous neurons typically comprise the layers of artificial neural networks. Here we construct neural networks from neurons that learn their own activation functions, quickly diversify, and subsequently outperform their homogeneous counterparts on image classification and nonlinear regression tasks. Sub-networks instantiate the neurons, which meta-learn especially efficient sets of nonlinear responses. Examples include conventional neural networks classifying digits and forecasting a van der Pol oscillator and physics-informed Hamiltonian neural networks learning Hénon–Heiles stellar orbits and the swing of a video recorded pendulum clock. Such learned diversity provides examples of dynamical systems selecting diversity over uniformity and elucidates the role of diversity in natural and artificial systems.
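The summary describes neurons whose nonlinear responses are produced by small sub-networks and learned during training. As a rough sketch only (not the authors' implementation; the class names LearnedActivation and DiverseMLP, the layer sizes, and the tanh basis are assumptions introduced for illustration), a layer of per-neuron learnable activation functions in PyTorch might look like this:

    # Sketch: each neuron's activation is a tiny learnable sub-network,
    # so individual neurons can diversify their nonlinear responses.
    import torch
    import torch.nn as nn

    class LearnedActivation(nn.Module):
        """Per-neuron activation parameterized by a small (1 -> hidden -> 1) map."""
        def __init__(self, width, hidden=8):
            super().__init__()
            # Independent parameters for every neuron in the layer.
            self.w1 = nn.Parameter(0.5 * torch.randn(width, hidden))
            self.b1 = nn.Parameter(torch.zeros(width, hidden))
            self.w2 = nn.Parameter(0.5 * torch.randn(width, hidden))

        def forward(self, x):                      # x: (batch, width)
            h = torch.tanh(x.unsqueeze(-1) * self.w1 + self.b1)
            return (h * self.w2).sum(-1)           # pointwise, per-neuron response

    class DiverseMLP(nn.Module):
        """Ordinary linear layers, but the activation itself is learned."""
        def __init__(self, n_in, width, n_out):
            super().__init__()
            self.fc1 = nn.Linear(n_in, width)
            self.act = LearnedActivation(width)    # neurons learn their own nonlinearity
            self.fc2 = nn.Linear(width, n_out)

        def forward(self, x):
            return self.fc2(self.act(self.fc1(x)))

Training such a model with a standard optimizer updates the activation parameters alongside the weights, which is one simple way the per-neuron nonlinearities could diversify; the paper's meta-learning setup and Hamiltonian variants are not reproduced here.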