
Barriers for the performance of graph neural networks (GNN) in discrete random structures

Recently, graph neural network (GNN)-based algorithms were proposed to solve a variety of combinatorial optimization problems [M. J. Schuetz, J. K. Brubaker, H. G. Katzgraber, Nat. Mach. Intell. 4, 367–377 (2022)]. GNNs were tested in particular on randomly generated instances of these problems. The pu...

Full description

Bibliographic Details
Main Author: Gamarnik, David
Format: Online Article Text
Language: English
Published: National Academy of Sciences, 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10655568/
https://www.ncbi.nlm.nih.gov/pubmed/37931095
http://dx.doi.org/10.1073/pnas.2314092120
author Gamarnik, David
collection PubMed
description Recently, graph neural network (GNN)-based algorithms were proposed to solve a variety of combinatorial optimization problems [M. J. Schuetz, J. K. Brubaker, H. G. Katzgraber, Nat. Mach. Intell. 4, 367–377 (2022)]. GNNs were tested in particular on randomly generated instances of these problems. The publication [M. J. Schuetz, J. K. Brubaker, H. G. Katzgraber, Nat. Mach. Intell. 4, 367–377 (2022)] stirred a debate about whether the GNN-based method was adequately benchmarked against the best prior methods. In particular, the critical commentaries [M. C. Angelini, F. Ricci-Tersenghi, Nat. Mach. Intell. 5, 29–31 (2023)] and [S. Boettcher, Nat. Mach. Intell. 5, 24–25 (2023)] point out that a simple greedy algorithm performs better than the GNN. We do not intend to discuss the merits of the arguments and counterarguments in these papers. Rather, in this note, we establish a fundamental limitation on running GNNs on the random instances considered in these references, for a broad range of GNN architectures. Specifically, these barriers hold when the depth of the GNN does not scale with the graph size (we note that depth 2 was used in the experiments in [M. J. Schuetz, J. K. Brubaker, H. G. Katzgraber, Nat. Mach. Intell. 4, 367–377 (2022)]), and, importantly, they hold regardless of any other parameters of the GNN architecture. These limitations arise from the presence of the overlap gap property (OGP) phase transition, which is a barrier for many algorithms, including, importantly, local algorithms, of which GNNs are an example. At the same time, some algorithms known prior to the introduction of GNNs provide the best results for these problems up to the OGP phase transition. This leaves very little room for GNNs to outperform the known algorithms, and on this basis, we side with the conclusions of [M. C. Angelini, F. Ricci-Tersenghi, Nat. Mach. Intell. 5, 29–31 (2023)] and [S. Boettcher, Nat. Mach. Intell. 5, 24–25 (2023)].
format Online
Article
Text
id pubmed-10655568
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher National Academy of Sciences
record_format MEDLINE/PubMed
spelling pubmed-10655568 2023-11-06 Barriers for the performance of graph neural networks (GNN) in discrete random structures. Gamarnik, David. Proc Natl Acad Sci U S A, Physical Sciences. National Academy of Sciences 2023-11-06 2023-11-14 /pmc/articles/PMC10655568/ /pubmed/37931095 http://dx.doi.org/10.1073/pnas.2314092120 Text en. Copyright © 2023 the Author(s). Published by PNAS. This open access article is distributed under the Creative Commons Attribution-NonCommercial-NoDerivatives License 4.0 (CC BY-NC-ND) (https://creativecommons.org/licenses/by-nc-nd/4.0/).
title Barriers for the performance of graph neural networks (GNN) in discrete random structures
topic Physical Sciences