Dual-Coupled CNN-GCN-Based Classification for Hyperspectral and LiDAR Data

Bibliographic Details
Main Authors: Wang, Lei, Wang, Xili
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9371133/
https://www.ncbi.nlm.nih.gov/pubmed/35957291
http://dx.doi.org/10.3390/s22155735
author Wang, Lei
Wang, Xili
collection PubMed
description Deep learning techniques have brought substantial performance gains to remote sensing image classification. Among them, convolutional neural networks (CNN) can extract rich spatial and spectral features from hyperspectral images within a short-range region, whereas graph convolutional networks (GCN) can model middle- and long-range spatial relations (or structural features) between samples via a graph structure. These complementary features make it possible to classify remote sensing images finely. In addition, hyperspectral (HS) images and light detection and ranging (LiDAR) images can provide the spatial-spectral information and the elevation information of targets on the Earth’s surface, respectively. Such multi-source remote sensing data can further improve classification accuracy in complex scenes. This paper proposes a classification method for HS and LiDAR data based on a dual-coupled CNN-GCN structure. The model can be divided into a coupled CNN and a coupled GCN. The former employs a weight-sharing mechanism to structurally fuse and simplify the dual CNN models and to extract the spatial features of the HS and LiDAR data. The latter first concatenates the HS and LiDAR data to construct a uniform graph structure; the dual GCN models then perform structural fusion by sharing the graph structure and the weight matrices of some layers, extracting the structural information of each modality. Finally, a unified feature fusion module combines these features, and the resulting hybrid features are fed into a standard classifier for the pixel-level classification task. Extensive experiments on two real-world hyperspectral and LiDAR datasets demonstrate the effectiveness and superiority of the proposed method over state-of-the-art baselines such as the two-branch CNN and the context CNN. In particular, the overall accuracy of 99.11% on the Trento dataset is the best classification performance reported so far.
format Online
Article
Text
id pubmed-9371133
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
journal Sensors (Basel)
published 2022-07-31
rights © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title Dual-Coupled CNN-GCN-Based Classification for Hyperspectral and LiDAR Data
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9371133/
https://www.ncbi.nlm.nih.gov/pubmed/35957291
http://dx.doi.org/10.3390/s22155735
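
The description above outlines two coupling mechanisms: a pair of CNN branches fused by sharing the weights of their deeper layers, and a pair of GCN branches that propagate over one common graph (built from the concatenated HS and LiDAR features) and share some layer weights. What follows is a minimal PyTorch sketch of that idea, offered purely as an illustration under stated assumptions: every name here (CoupledCNN, CoupledGCN, build_normalized_adjacency, n_feat, k) and all layer sizes are invented for this sketch and do not reproduce the authors' published architecture.

# Minimal sketch of the dual-coupled CNN-GCN idea (illustrative only;
# layer sizes, names, and the kNN graph construction are assumptions,
# not the configuration from the paper).
import torch
import torch.nn as nn
import torch.nn.functional as F


def build_normalized_adjacency(feats: torch.Tensor, k: int = 10) -> torch.Tensor:
    """kNN graph over concatenated HS+LiDAR features, symmetrically
    normalized as D^-1/2 (A + I) D^-1/2 (self-loops come from topk)."""
    dist = torch.cdist(feats, feats)                  # [N, N] pairwise distances
    _, idx = dist.topk(k + 1, dim=1, largest=False)   # self + k nearest neighbors
    adj = torch.zeros(feats.size(0), feats.size(0))
    adj.scatter_(1, idx, 1.0)
    adj = ((adj + adj.t()) > 0).float()               # symmetrize
    d_inv_sqrt = adj.sum(1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * adj * d_inv_sqrt.unsqueeze(0)


class CoupledCNN(nn.Module):
    """Two CNN branches (HS, LiDAR) coupled by a shared deeper block."""

    def __init__(self, hs_bands: int, lidar_bands: int, n_feat: int = 64):
        super().__init__()
        # Modality-specific stems map each input to a common channel width.
        self.hs_stem = nn.Conv2d(hs_bands, n_feat, 3, padding=1)
        self.lidar_stem = nn.Conv2d(lidar_bands, n_feat, 3, padding=1)
        # Weight sharing: one block processes both modalities.
        self.shared = nn.Sequential(
            nn.Conv2d(n_feat, n_feat, 3, padding=1), nn.ReLU(),
            nn.Conv2d(n_feat, n_feat, 3, padding=1), nn.ReLU(),
        )

    def forward(self, hs, lidar):
        return (self.shared(F.relu(self.hs_stem(hs))),
                self.shared(F.relu(self.lidar_stem(lidar))))


class CoupledGCN(nn.Module):
    """Two GCN branches sharing one graph and one layer weight matrix."""

    def __init__(self, hs_dim: int, lidar_dim: int, n_feat: int = 64):
        super().__init__()
        self.hs_in = nn.Linear(hs_dim, n_feat)
        self.lidar_in = nn.Linear(lidar_dim, n_feat)
        self.shared_w = nn.Linear(n_feat, n_feat)     # shared GCN weights

    def gcn(self, x, adj):
        return F.relu(adj @ self.shared_w(x))         # H' = A_norm H W

    def forward(self, hs_x, lidar_x, adj):
        # Both branches propagate over the SAME normalized adjacency.
        return (self.gcn(self.hs_in(hs_x), adj),
                self.gcn(self.lidar_in(lidar_x), adj))

Hypothetical usage: build the graph once from the stacked per-pixel features, e.g. adj = build_normalized_adjacency(torch.cat([hs_feat, lidar_feat], dim=1)), run both coupled modules, and fuse the four output streams (two CNN, two GCN) by concatenation before a softmax classifier, in the spirit of the unified feature fusion module the abstract mentions.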