Global Consistent Graph Convolutional Network for Hyperspectral Image Classification

Abstract
While semisupervised methods based on graph convolutional networks (GCNs) can achieve good results in hyperspectral image (HSI) classification, their performance is limited because they rely only on spatial-spectral similarity when constructing local adjacency graphs. Relying on local adjacency graphs also limits their ability to ensure global feature consistency in complex hyperspectral remote sensing environments. Spatial-spectral information alone typically cannot provide a reliable similarity measurement for constructing a global graph when intraclass pixels exhibit large spectral variability and long spatial distances. To address this issue, this article presents a novel globally consistent GCN (GCGCN) for HSI classification. In the proposed method, a reliable local initial graph is first built from the inherent spatial-spectral information and treated as a variable to be optimized. Then, adaptive global high-order neighbors are explored to capture the underlying rich spatial contextual information by exploiting topologically consistent graph connectivity rather than the common strategy of relying only on spatial-spectral similarity. Finally, the adaptive global high-order graph structure is combined with a two-layer network to smooth the features of same-class samples globally and maintain high global feature consistency. The proposed GCGCN is evaluated on three real HSI data sets to demonstrate its superiority over ten classification methods, achieving state-of-the-art results on all three data sets in terms of four evaluation metrics: overall accuracy (OA), kappa coefficient (KC), average accuracy (AA), and class accuracy (CA).
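As a rough illustration of the final step described above (not the authors' implementation), the minimal sketch below shows how a two-layer GCN could propagate features through a high-order, multi-hop adjacency matrix. The function names (normalize_adjacency, high_order_adjacency, two_layer_gcn_forward), the power-of-adjacency approximation of global connectivity, and the toy data are all assumptions introduced here for illustration; the adaptive graph learning and topological-consistency criterion of GCGCN are not reproduced.

```python
import numpy as np

def normalize_adjacency(A):
    """Symmetrically normalize an adjacency matrix with self-loops:
    A_hat = D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def high_order_adjacency(A, order=2):
    """Approximate global high-order connectivity by raising the
    normalized adjacency to the given power (reaches K-hop neighbors)."""
    return np.linalg.matrix_power(normalize_adjacency(A), order)

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def two_layer_gcn_forward(A, X, W1, W2, order=2):
    """Two-layer GCN forward pass using a high-order propagation matrix,
    so each node aggregates features from neighbors up to `order` hops away."""
    P = high_order_adjacency(A, order)   # high-order propagation matrix
    H = relu(P @ X @ W1)                 # first graph convolution layer
    return softmax(P @ H @ W2)           # second layer + class probabilities

# Toy usage: 5 nodes (e.g., superpixels), 4 spectral features, 3 classes.
rng = np.random.default_rng(0)
A = (rng.random((5, 5)) > 0.5).astype(float)
A = np.triu(A, 1); A = A + A.T           # symmetric adjacency, no self-loops
X = rng.random((5, 4))
W1 = rng.random((4, 8)); W2 = rng.random((8, 3))
print(two_layer_gcn_forward(A, X, W1, W2, order=3).round(3))
```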
Funding Information
  • National Natural Science Foundation of China (62072345, 41671382)
  • LIESMARS Special Research Funding
