Deep physical neural networks trained with backpropagation

Deep-learning models have become pervasive tools in science and engineering. However, their energy requirements now increasingly limit their scalability(1). Deep-learning accelerators(2–9) aim to perform deep learning energy-efficiently, usually targeting the inference phase and often by exploiting physical substrates beyond conventional electronics. Approaches so far(10–22) have been unable to apply the backpropagation algorithm to train unconventional novel hardware in situ. The advantages of backpropagation have made it the de facto training method for large-scale neural networks, so this deficiency constitutes a major impediment. Here we introduce a hybrid in situ–in silico algorithm, called physics-aware training, that applies backpropagation to train controllable physical systems. Just as deep learning realizes computations with deep neural networks made from layers of mathematical functions, our approach allows us to train deep physical neural networks made from layers of controllable physical systems, even when the physical layers lack any mathematical isomorphism to conventional artificial neural network layers. To demonstrate the universality of our approach, we train diverse physical neural networks based on optics, mechanics and electronics to experimentally perform audio and image classification tasks. Physics-aware training combines the scalability of backpropagation with the automatic mitigation of imperfections and noise achievable with in situ algorithms. Physical neural networks have the potential to perform machine learning faster and more energy-efficiently than conventional electronic processors and, more broadly, can endow physical systems with automatically designed physical functionalities, for example, for robotics(23–26), materials(27–29) and smart sensors(30–32).
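The mechanism named in the abstract, physics-aware training, pairs in situ forward passes with in silico backward passes: the layer's output is measured from the real hardware, while gradients are backpropagated through a differentiable digital model of that hardware. Below is a minimal PyTorch sketch of that hybrid loop; `physical_system` and `digital_twin` are hypothetical callables standing in for the apparatus and its differentiable model, and the code is an illustrative reading of the abstract, not the authors' implementation.

```python
import torch

class PhysicsAwareLayer(torch.autograd.Function):
    """Hybrid in situ–in silico step: the forward pass is a measurement
    of the real physical system, while the backward pass backpropagates
    through a differentiable digital model (a "digital twin")."""

    @staticmethod
    def forward(ctx, x, params, physical_system, digital_twin):
        ctx.save_for_backward(x, params)
        ctx.digital_twin = digital_twin
        # Non-differentiable: the output is measured from real hardware.
        return physical_system(x, params)

    @staticmethod
    def backward(ctx, grad_output):
        x, params = ctx.saved_tensors
        # Estimate gradients by re-simulating the layer differentiably.
        with torch.enable_grad():
            x_sim = x.detach().requires_grad_()
            p_sim = params.detach().requires_grad_()
            y_sim = ctx.digital_twin(x_sim, p_sim)
            grad_x, grad_p = torch.autograd.grad(
                y_sim, (x_sim, p_sim), grad_output)
        # One gradient per forward input; the two callables get None.
        return grad_x, grad_p, None, None
```

A layer defined this way is used as `y = PhysicsAwareLayer.apply(x, params, physical_system, digital_twin)`, so parameter updates are driven by simulated gradients while activations come from real measurements.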

Bibliographic Details
Main Authors: Wright, Logan G., Onodera, Tatsuhiro, Stein, Martin M., Wang, Tianyu, Schachter, Darren T., Hu, Zoey, McMahon, Peter L.
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 26 January 2022 (Nature)
Subjects: Article
Collection: PubMed (National Center for Biotechnology Information); record ID pubmed-8791835; record format MEDLINE/PubMed
License: © The Author(s) 2022. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third-party material in this article are included in the article's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit https://creativecommons.org/licenses/by/4.0/.
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8791835/
https://www.ncbi.nlm.nih.gov/pubmed/35082422
http://dx.doi.org/10.1038/s41586-021-04223-6