
Multimodal transistors as ReLU activation functions in physical neural network classifiers

Artificial neural networks (ANNs) providing sophisticated, power-efficient classification are finding their way into thin-film electronics. Thin-film technologies require robust, layout-efficient devices with facile manufacturability. Here, we show how the multimodal transistor’s (MMT’s) transfer characteristic, with linear dependence in saturation, replicates the rectified linear unit (ReLU) activation function of convolutional ANNs (CNNs). Using MATLAB, we evaluate CNN performance using systematically distorted ReLU functions, then substitute measured and simulated MMT transfer characteristics as proxies for ReLU. High classification accuracy is maintained, despite large variations in geometrical and electrical parameters, as CNNs use the same activation functions for training and classification.
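As a rough illustration of the method summarized in the abstract (a sketch, not the authors' MATLAB code), the Python snippet below shows the key substitution: a measured-style MMT transfer characteristic, here approximated by placeholder sample values and linear interpolation, used as a drop-in replacement for the ReLU activation applied to a CNN feature map. All names and numbers are illustrative assumptions.

import numpy as np

# Placeholder for a measured MMT transfer characteristic: near-zero output
# below turn-on, then an approximately linear rise (ReLU-like in saturation).
v_in = np.linspace(-1.0, 5.0, 13)            # input voltage samples (arbitrary units)
i_out = np.maximum(0.0, 0.9 * (v_in - 0.4))  # corresponding output current samples

def mmt_activation(x):
    # Linearly interpolate the measured curve and use it as the activation function.
    return np.interp(x, v_in, i_out)

def relu(x):
    return np.maximum(0.0, x)

# Apply both activations to the same pre-activation feature map and compare.
feature_map = np.random.default_rng(0).normal(size=(4, 4))
print(relu(feature_map))
print(mmt_activation(feature_map))

In the setting described by the abstract, the network uses the same (device-shaped) activation for both training and inference, which is why classification accuracy is reported to remain high despite device-level variations.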


Bibliographic Details

Main Authors: Surekcigil Pesch, Isin; Bestelink, Eva; de Sagazan, Olivier; Mehonic, Adnan; Sporea, Radu A.
Format: Online, Article, Text
Language: English
Published: Nature Publishing Group UK, 2022
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8758690/
https://www.ncbi.nlm.nih.gov/pubmed/35027631
http://dx.doi.org/10.1038/s41598-021-04614-9
Published in: Sci Rep, 13 January 2022
License: © The Author(s) 2022. Open Access under the Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).