
Power-law scaling to assist with key challenges in artificial intelligence

Power-law scaling, a central concept in critical phenomena, is found to be useful in deep learning, where optimized test errors on handwritten digit examples converge as a power law to zero with database size. For rapid decision making with one training epoch, in which each example is presented only once to the trained network, the power-law exponent increased with the number of hidden layers. For the largest dataset, the obtained test error was estimated to be in the proximity of state-of-the-art algorithms for large epoch numbers. Power-law scaling assists with key challenges found in current artificial intelligence applications and facilitates an a priori dataset size estimation to achieve a desired test accuracy. It establishes a benchmark for measuring training complexity and a quantitative hierarchy of machine learning tasks and algorithms.


Bibliographic Details
Main Authors: Meir, Yuval, Sardi, Shira, Hodassman, Shiri, Kisos, Karin, Ben-Noam, Itamar, Goldental, Amir, Kanter, Ido
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7665018/
https://www.ncbi.nlm.nih.gov/pubmed/33184422
http://dx.doi.org/10.1038/s41598-020-76764-1
_version_ 1783609940856799232
author Meir, Yuval
Sardi, Shira
Hodassman, Shiri
Kisos, Karin
Ben-Noam, Itamar
Goldental, Amir
Kanter, Ido
author_facet Meir, Yuval
Sardi, Shira
Hodassman, Shiri
Kisos, Karin
Ben-Noam, Itamar
Goldental, Amir
Kanter, Ido
author_sort Meir, Yuval
collection PubMed
description Power-law scaling, a central concept in critical phenomena, is found to be useful in deep learning, where optimized test errors on handwritten digit examples converge as a power law to zero with database size. For rapid decision making with one training epoch, in which each example is presented only once to the trained network, the power-law exponent increased with the number of hidden layers. For the largest dataset, the obtained test error was estimated to be in the proximity of state-of-the-art algorithms for large epoch numbers. Power-law scaling assists with key challenges found in current artificial intelligence applications and facilitates an a priori dataset size estimation to achieve a desired test accuracy. It establishes a benchmark for measuring training complexity and a quantitative hierarchy of machine learning tasks and algorithms.
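
As a rough, illustrative sketch of the a priori dataset-size estimation described above (all numbers below are assumed for illustration and are not taken from the article): the test error is modeled as a power law of the database size D, error(D) ≈ a·D^(−β), the exponent β is fitted by linear regression in log-log space, and the fit is inverted to estimate how many training examples would be needed to reach a target test error.

import numpy as np

# Hypothetical test-error measurements at several dataset sizes (illustrative values only)
dataset_sizes = np.array([1_000, 5_000, 10_000, 30_000, 60_000])
test_errors = np.array([0.12, 0.07, 0.055, 0.038, 0.030])

# Fit error(D) ~ a * D**(-beta) via linear regression in log-log space:
# log(error) = log(a) - beta * log(D)
slope, intercept = np.polyfit(np.log(dataset_sizes), np.log(test_errors), 1)
beta, a = -slope, np.exp(intercept)

# Invert the fit: D = (a / target_error)**(1 / beta) gives an a priori
# dataset-size estimate for a desired test error
target_error = 0.02
required_size = (a / target_error) ** (1.0 / beta)

print(f"fitted power-law exponent beta = {beta:.3f}")
print(f"estimated examples needed for {target_error:.0%} test error: {required_size:,.0f}")
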
format Online
Article
Text
id pubmed-7665018
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-76650182020-11-16 Power-law scaling to assist with key challenges in artificial intelligence Meir, Yuval Sardi, Shira Hodassman, Shiri Kisos, Karin Ben-Noam, Itamar Goldental, Amir Kanter, Ido Sci Rep Article Power-law scaling, a central concept in critical phenomena, is found to be useful in deep learning, where optimized test errors on handwritten digit examples converge as a power law to zero with database size. For rapid decision making with one training epoch, in which each example is presented only once to the trained network, the power-law exponent increased with the number of hidden layers. For the largest dataset, the obtained test error was estimated to be in the proximity of state-of-the-art algorithms for large epoch numbers. Power-law scaling assists with key challenges found in current artificial intelligence applications and facilitates an a priori dataset size estimation to achieve a desired test accuracy. It establishes a benchmark for measuring training complexity and a quantitative hierarchy of machine learning tasks and algorithms. Nature Publishing Group UK 2020-11-12 /pmc/articles/PMC7665018/ /pubmed/33184422 http://dx.doi.org/10.1038/s41598-020-76764-1 Text en © The Author(s) 2020 Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
spellingShingle Article
Meir, Yuval
Sardi, Shira
Hodassman, Shiri
Kisos, Karin
Ben-Noam, Itamar
Goldental, Amir
Kanter, Ido
Power-law scaling to assist with key challenges in artificial intelligence
title Power-law scaling to assist with key challenges in artificial intelligence
title_full Power-law scaling to assist with key challenges in artificial intelligence
title_fullStr Power-law scaling to assist with key challenges in artificial intelligence
title_full_unstemmed Power-law scaling to assist with key challenges in artificial intelligence
title_short Power-law scaling to assist with key challenges in artificial intelligence
title_sort power-law scaling to assist with key challenges in artificial intelligence
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7665018/
https://www.ncbi.nlm.nih.gov/pubmed/33184422
http://dx.doi.org/10.1038/s41598-020-76764-1
work_keys_str_mv AT meiryuval powerlawscalingtoassistwithkeychallengesinartificialintelligence
AT sardishira powerlawscalingtoassistwithkeychallengesinartificialintelligence
AT hodassmanshiri powerlawscalingtoassistwithkeychallengesinartificialintelligence
AT kisoskarin powerlawscalingtoassistwithkeychallengesinartificialintelligence
AT bennoamitamar powerlawscalingtoassistwithkeychallengesinartificialintelligence
AT goldentalamir powerlawscalingtoassistwithkeychallengesinartificialintelligence
AT kanterido powerlawscalingtoassistwithkeychallengesinartificialintelligence