
Random projection forest initialization for graph convolutional networks


Bibliographic Details
Main Authors: Alshammari, Mashaan, Stavrakakis, John, Ahmed, Adel F., Takatsuka, Masahiro
Format: Online Article Text
Language: English
Published: Elsevier 2023
Subjects: Computer Science
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10433121/
https://www.ncbi.nlm.nih.gov/pubmed/37601292
http://dx.doi.org/10.1016/j.mex.2023.102315
_version_ 1785091580415180800
author Alshammari, Mashaan
Stavrakakis, John
Ahmed, Adel F.
Takatsuka, Masahiro
author_facet Alshammari, Mashaan
Stavrakakis, John
Ahmed, Adel F.
Takatsuka, Masahiro
author_sort Alshammari, Mashaan
collection PubMed
description Graph convolutional networks (GCNs) were a great step towards extending deep learning to graphs. A GCN uses the graph A and the feature matrix X as inputs. However, in most cases the graph A is missing and we are only provided with the feature matrix X. To solve this problem, classical graphs such as the k-nearest neighbor (k-nn) graph are usually used to construct the graph A and initialize the GCN. Although it is computationally efficient to construct k-nn graphs, the constructed graph might not be very useful for learning. In a k-nn graph, points are restricted to a fixed number of edges, and all edges in the graph have equal weights. Our contribution is initializing the GCN using a graph with varying weights on edges, which provides better performance compared to k-nn initialization. Our proposed method is based on the random projection forest (rpForest). rpForest enables us to assign varying weights to edges, indicating varying importance, which enhanced the learning. The number of trees T is a hyperparameter in rpForest. We performed spectral analysis to help set this parameter in the right range. In the experiments, initializing the GCN using rpForest provides better results compared to k-nn initialization. • Constructing the graph A using rpForest sets varying weights on edges, which represent the similarity between pairs of samples, unlike the k-nearest neighbor graph where all weights are equal. • Using the rpForest graph to initialize the GCN provides better results compared to k-nn initialization. The varying weights in the rpForest graph quantify the similarity between samples, which guided the GCN training to deliver better results. • The rpForest graph involves tuning the hyperparameter T (number of trees). We provide an informative way to set this hyperparameter through spectral analysis.
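
A minimal sketch of the idea in the description above (not the authors' code): it contrasts a k-nn adjacency, whose edges all carry weight 1, with an rpForest-style adjacency whose edge weights vary. The weighting used here, the fraction of T random-projection trees in which two points fall in the same leaf, together with the helper names and the k, T, and leaf_size values, is an illustrative assumption; the paper's exact weighting scheme and its spectral-analysis procedure for choosing T are not reproduced.

    import numpy as np

    def knn_graph(X, k):
        # Symmetric k-nn adjacency: every edge gets the same weight 1.
        n = X.shape[0]
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
        np.fill_diagonal(d2, np.inf)                          # exclude self-edges
        A = np.zeros((n, n))
        nbrs = np.argsort(d2, axis=1)[:, :k]                  # k closest points per row
        for i in range(n):
            A[i, nbrs[i]] = 1.0
        return np.maximum(A, A.T)                             # symmetrize

    def rp_tree_leaves(X, idx, rng, leaf_size):
        # Recursively split the index set with random hyperplanes; yield the leaves.
        if len(idx) <= leaf_size:
            yield idx
            return
        proj = X[idx] @ rng.normal(size=X.shape[1])           # project onto a random direction
        split = np.median(proj)
        left, right = idx[proj <= split], idx[proj > split]
        if len(left) == 0 or len(right) == 0:                 # degenerate split: stop here
            yield idx
            return
        yield from rp_tree_leaves(X, left, rng, leaf_size)
        yield from rp_tree_leaves(X, right, rng, leaf_size)

    def rpforest_graph(X, T=20, leaf_size=16, seed=0):
        # Assumed scheme: edge weight = fraction of the T rp-trees where i and j share a leaf.
        rng = np.random.default_rng(seed)
        n = X.shape[0]
        A = np.zeros((n, n))
        for _ in range(T):
            for leaf in rp_tree_leaves(X, np.arange(n), rng, leaf_size):
                A[np.ix_(leaf, leaf)] += 1.0 / T
        np.fill_diagonal(A, 0.0)
        return A

    # Either adjacency can then initialize a GCN, e.g. after the usual renormalization
    # A_hat = D^{-1/2} (A + I) D^{-1/2} of Kipf & Welling's propagation rule.
    X = np.random.default_rng(1).normal(size=(200, 8))
    A_knn, A_rpf = knn_graph(X, k=5), rpforest_graph(X, T=20)
    print(np.unique(A_knn).size, np.unique(A_rpf).size)       # 2 weight levels vs. many
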
format Online
Article
Text
id pubmed-10433121
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Elsevier
record_format MEDLINE/PubMed
spelling pubmed-10433121 2023-08-18 Random projection forest initialization for graph convolutional networks Alshammari, Mashaan Stavrakakis, John Ahmed, Adel F. Takatsuka, Masahiro MethodsX Computer Science Graph convolutional networks (GCNs) were a great step towards extending deep learning to graphs. A GCN uses the graph A and the feature matrix X as inputs. However, in most cases the graph A is missing and we are only provided with the feature matrix X. To solve this problem, classical graphs such as the k-nearest neighbor (k-nn) graph are usually used to construct the graph A and initialize the GCN. Although it is computationally efficient to construct k-nn graphs, the constructed graph might not be very useful for learning. In a k-nn graph, points are restricted to a fixed number of edges, and all edges in the graph have equal weights. Our contribution is initializing the GCN using a graph with varying weights on edges, which provides better performance compared to k-nn initialization. Our proposed method is based on the random projection forest (rpForest). rpForest enables us to assign varying weights to edges, indicating varying importance, which enhanced the learning. The number of trees T is a hyperparameter in rpForest. We performed spectral analysis to help set this parameter in the right range. In the experiments, initializing the GCN using rpForest provides better results compared to k-nn initialization. • Constructing the graph A using rpForest sets varying weights on edges, which represent the similarity between pairs of samples, unlike the k-nearest neighbor graph where all weights are equal. • Using the rpForest graph to initialize the GCN provides better results compared to k-nn initialization. The varying weights in the rpForest graph quantify the similarity between samples, which guided the GCN training to deliver better results. • The rpForest graph involves tuning the hyperparameter T (number of trees). We provide an informative way to set this hyperparameter through spectral analysis. Elsevier 2023-08-05 /pmc/articles/PMC10433121/ /pubmed/37601292 http://dx.doi.org/10.1016/j.mex.2023.102315 Text en © 2023 The Author(s) https://creativecommons.org/licenses/by/4.0/ This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
spellingShingle Computer Science
Alshammari, Mashaan
Stavrakakis, John
Ahmed, Adel F.
Takatsuka, Masahiro
Random projection forest initialization for graph convolutional networks
title Random projection forest initialization for graph convolutional networks
title_full Random projection forest initialization for graph convolutional networks
title_fullStr Random projection forest initialization for graph convolutional networks
title_full_unstemmed Random projection forest initialization for graph convolutional networks
title_short Random projection forest initialization for graph convolutional networks
title_sort random projection forest initialization for graph convolutional networks
topic Computer Science
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10433121/
https://www.ncbi.nlm.nih.gov/pubmed/37601292
http://dx.doi.org/10.1016/j.mex.2023.102315
work_keys_str_mv AT alshammarimashaan randomprojectionforestinitializationforgraphconvolutionalnetworks
AT stavrakakisjohn randomprojectionforestinitializationforgraphconvolutionalnetworks
AT ahmedadelf randomprojectionforestinitializationforgraphconvolutionalnetworks
AT takatsukamasahiro randomprojectionforestinitializationforgraphconvolutionalnetworks