
Network Representation Learning With Community Awareness and Its Applications in Brain Networks

Previous network representation learning methods mainly focus on the microscopic structure of a network, i.e., the pairwise relationships or similarities between nodes. The mesoscopic structure, i.e., the community structure, an essential property of real networks, has not been thoroughly exploited in network representation learning. Here we propose a deep attributed network representation learning with community awareness (DANRL-CA) framework. Specifically, we design a neighborhood-enhancement autoencoder module to capture the 2-step relations between node pairs. To explore multi-step relations, we construct a community-aware skip-gram module based on the encoder. We introduce two variants of DANRL-CA, namely DANRL-CA-AM and DANRL-CA-CSM, which incorporate community information and attribute semantics into node neighborhoods in different ways. We compare the two variants with state-of-the-art methods on four datasets for node classification and link prediction, and further apply our models to a brain network. The results indicate the scalability and effectiveness of our method on various networks. Compared with DANRL-CA-AM, DANRL-CA-CSM coordinates the roles of node attributes and community information more flexibly during representation learning and performs better on networks with sparse topology and sparse node attributes.
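To make the two learning components in the abstract concrete, the following is a minimal, hypothetical sketch (plain PyTorch; every class name, dimension, and loss weight is our own assumption, not the authors' implementation) of how a neighborhood-reconstruction autoencoder, a skip-gram objective over walk-based node pairs, and a community-center regularizer could be combined into one training loss in the spirit of the DANRL-CA framework described above.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CommunityAwareEncoder(nn.Module):
    """Hypothetical sketch: an attribute encoder shared by an autoencoder branch
    and a skip-gram branch, plus per-node context ("output") embeddings."""
    def __init__(self, attr_dim, hidden_dim, embed_dim, num_nodes):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(attr_dim, hidden_dim), nn.ReLU(),
                                     nn.Linear(hidden_dim, embed_dim))
        self.decoder = nn.Sequential(nn.Linear(embed_dim, hidden_dim), nn.ReLU(),
                                     nn.Linear(hidden_dim, attr_dim))
        self.context = nn.Embedding(num_nodes, embed_dim)  # skip-gram context vectors

    def forward(self, x):
        z = self.encoder(x)        # node embeddings
        return z, self.decoder(z)  # embeddings and attribute reconstruction

def community_aware_loss(model, x, target, centers, pos_pairs, neg_pairs,
                         alpha=1.0, beta=0.1):
    # x: node attribute matrix [n, attr_dim]
    # target: neighborhood-enhanced reconstruction target [n, attr_dim]
    # centers: one community-center vector per node [n, embed_dim] (assumed precomputed)
    # pos_pairs / neg_pairs: (node, context) index pairs, e.g., sampled from random walks
    z, x_hat = model(x)
    recon = F.mse_loss(x_hat, target)  # autoencoder term: short-range (e.g., 2-step) structure
    pos = -F.logsigmoid((z[pos_pairs[:, 0]] *
                         model.context(pos_pairs[:, 1])).sum(-1)).mean()
    neg = -F.logsigmoid(-(z[neg_pairs[:, 0]] *
                          model.context(neg_pairs[:, 1])).sum(-1)).mean()
    community = F.mse_loss(z, centers)  # pull embeddings toward their community centers
    return recon + alpha * (pos + neg) + beta * community

# Toy usage with random data:
# model = CommunityAwareEncoder(attr_dim=16, hidden_dim=32, embed_dim=8, num_nodes=100)
# x = torch.randn(100, 16); centers = torch.randn(100, 8)
# pairs = torch.randint(0, 100, (256, 2))
# print(community_aware_loss(model, x, x, centers, pairs, pairs))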

Bibliographic Details
Main Authors: Shi, Min; Qu, Bo; Li, Xiang; Li, Cong
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2022-05-27
Journal: Front Physiol
Subjects: Physiology
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9196130/
https://www.ncbi.nlm.nih.gov/pubmed/35711311
http://dx.doi.org/10.3389/fphys.2022.910873
Collection: PubMed
Record ID: pubmed-9196130
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Copyright: © 2022 Shi, Qu, Li and Li. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY; https://creativecommons.org/licenses/by/4.0/). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.