Self-Supervised Node Classification with Strategy and Actively Selected Labeled Set
To alleviate the impact of insufficient labels in less-labeled classification problems, self-supervised learning improves the performance of graph neural networks (GNNs) by focusing on the information of unlabeled nodes. However, none of the existing self-supervised pretext tasks performs optimally on all datasets, and combining self-supervised and supervised tasks also requires choosing hyperparameters.
Main Authors: Kang, Yi; Liu, Ke; Cao, Zhiyuan; Zhang, Jiacai
Format: Online Article Text
Language: English
Published: MDPI, 2022
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9857737/ https://www.ncbi.nlm.nih.gov/pubmed/36673172 http://dx.doi.org/10.3390/e25010030
_version_ | 1784873932687409152 |
author | Kang, Yi; Liu, Ke; Cao, Zhiyuan; Zhang, Jiacai |
author_facet | Kang, Yi; Liu, Ke; Cao, Zhiyuan; Zhang, Jiacai |
author_sort | Kang, Yi |
collection | PubMed |
description | To alleviate the impact of insufficient labels in less-labeled classification problems, self-supervised learning improves the performance of graph neural networks (GNNs) by focusing on the information of unlabeled nodes. However, none of the existing self-supervised pretext tasks performs optimally on all datasets, and combining self-supervised and supervised tasks also requires choosing hyperparameters. To select the best-performing self-supervised pretext task for each dataset and to optimize the hyperparameters without expert experience, we propose a novel auto graph self-supervised learning framework and enhance it with a one-shot active learning method. Experimental results on three real-world citation datasets show that training GNNs with automatically optimized pretext tasks can match or even surpass the classification accuracy obtained with manually designed pretext tasks. On this basis, compared with using randomly selected labeled nodes, using actively selected labeled nodes can further improve the classification performance of GNNs. Both the active selection and the automatic optimization contribute to semi-supervised node classification. |
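The abstract describes jointly training a supervised node-classification objective with a self-supervised pretext objective, where the combination introduces a weighting hyperparameter that the proposed framework tunes automatically. As a minimal, hypothetical sketch (not the paper's implementation; `combined_loss`, `pick_weight`, and the candidate grid are illustrative names, and the search here is a toy grid search rather than the paper's optimizer):

```python
# Sketch: a joint objective L = L_sup + lam * L_ssl, where lam is the
# trade-off hyperparameter between the supervised loss and the
# self-supervised pretext loss. Selecting lam by validation score
# stands in for the automatic hyperparameter optimization the
# framework performs.

def combined_loss(sup_loss: float, ssl_loss: float, lam: float) -> float:
    """Supervised loss plus lam-weighted self-supervised pretext loss."""
    return sup_loss + lam * ssl_loss

def pick_weight(candidates, val_score):
    """Toy grid search: return the candidate weight whose model scores
    highest on validation data (val_score maps a weight to a score)."""
    return max(candidates, key=val_score)
```

In practice the same pattern applies per pretext task: each candidate task is trained jointly with the supervised loss, and both the task and its weight are chosen by validation performance rather than by hand.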
format | Online Article Text |
id | pubmed-9857737 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-9857737 2023-01-21 Self-Supervised Node Classification with Strategy and Actively Selected Labeled Set Kang, Yi; Liu, Ke; Cao, Zhiyuan; Zhang, Jiacai Entropy (Basel) Article To alleviate the impact of insufficient labels in less-labeled classification problems, self-supervised learning improves the performance of graph neural networks (GNNs) by focusing on the information of unlabeled nodes. However, none of the existing self-supervised pretext tasks performs optimally on all datasets, and combining self-supervised and supervised tasks also requires choosing hyperparameters. To select the best-performing self-supervised pretext task for each dataset and to optimize the hyperparameters without expert experience, we propose a novel auto graph self-supervised learning framework and enhance it with a one-shot active learning method. Experimental results on three real-world citation datasets show that training GNNs with automatically optimized pretext tasks can match or even surpass the classification accuracy obtained with manually designed pretext tasks. On this basis, compared with using randomly selected labeled nodes, using actively selected labeled nodes can further improve the classification performance of GNNs. Both the active selection and the automatic optimization contribute to semi-supervised node classification. MDPI 2022-12-23 /pmc/articles/PMC9857737/ /pubmed/36673172 http://dx.doi.org/10.3390/e25010030 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Kang, Yi; Liu, Ke; Cao, Zhiyuan; Zhang, Jiacai Self-Supervised Node Classification with Strategy and Actively Selected Labeled Set |
title | Self-Supervised Node Classification with Strategy and Actively Selected Labeled Set |
title_full | Self-Supervised Node Classification with Strategy and Actively Selected Labeled Set |
title_fullStr | Self-Supervised Node Classification with Strategy and Actively Selected Labeled Set |
title_full_unstemmed | Self-Supervised Node Classification with Strategy and Actively Selected Labeled Set |
title_short | Self-Supervised Node Classification with Strategy and Actively Selected Labeled Set |
title_sort | self-supervised node classification with strategy and actively selected labeled set |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9857737/ https://www.ncbi.nlm.nih.gov/pubmed/36673172 http://dx.doi.org/10.3390/e25010030 |
work_keys_str_mv | AT kangyi selfsupervisednodeclassificationwithstrategyandactivelyselectedlabeledset AT liuke selfsupervisednodeclassificationwithstrategyandactivelyselectedlabeledset AT caozhiyuan selfsupervisednodeclassificationwithstrategyandactivelyselectedlabeledset AT zhangjiacai selfsupervisednodeclassificationwithstrategyandactivelyselectedlabeledset |