Deep Recurrent Neural Networks for Hyperspectral Image Classification

IEEE Transactions on Geoscience and Remote Sensing, Vol. 55, No. 7, July 2017 | Lichao Mou, Student Member, IEEE; Pedram Ghamisi, Member, IEEE; Xiao Xiang Zhu, Senior Member, IEEE
This paper proposes a novel recurrent neural network (RNN) model for hyperspectral image classification that treats each hyperspectral pixel as a sequence of spectral values and determines its information category through network reasoning. The proposed RNN combines a newly designed activation function, parametric rectified tanh (PREtanh), with a modified gated recurrent unit (GRU) to process hyperspectral data efficiently. PREtanh addresses limitations of traditional activation functions such as tanh and ReLU: it produces bounded outputs and promotes sparse representations, which permits higher learning rates without risk of divergence. The modified GRU, which uses PREtanh for its hidden representation, has fewer parameters and is therefore better suited to small training sets.

Experiments on three airborne hyperspectral images (Pavia University, Houston, and Indian Pines) demonstrate competitive performance, with the proposed RNN outperforming state-of-the-art methods in overall accuracy, average accuracy, and Kappa coefficient. The network is particularly effective at distinguishing spectrally similar materials, even on challenging datasets such as Indian Pines, which is small and unbalanced. In addition, the RNN's testing phase is significantly faster than that of competing methods, making it a promising approach for hyperspectral image classification.
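To make the activation concrete, here is a minimal NumPy sketch of a parametric rectified tanh. The exact functional form and the default slope value are assumptions based on the summary's description (a PReLU-style learnable slope applied to the negative side of tanh), not the paper's definition:

```python
import numpy as np

def pretanh(x, a=0.25):
    """Sketch of a parametric rectified tanh (assumed form).

    Positive inputs pass through tanh; negative inputs pass through
    tanh scaled by a learnable slope `a`. Outputs stay bounded in
    (-a, 1), unlike ReLU, while `a` < 1 keeps negative responses
    small, encouraging sparser representations.
    """
    t = np.tanh(x)
    return np.where(x >= 0, t, a * t)
```

Because the output is bounded regardless of the input magnitude, gradients cannot blow up through the activation itself, which is consistent with the summary's claim that higher learning rates can be used without divergence.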
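The recurrent cell described above can be sketched as a GRU step whose candidate activation uses the rectified tanh instead of plain tanh. This is an illustrative simplification: the single update gate (no reset gate) is one plausible way to reduce parameter count, and the weight names `Wz`, `Uz`, `Wh`, `Uh` are assumptions, not the paper's notation:

```python
import numpy as np

def pretanh(x, a=0.25):
    # Assumed PReLU-style rectified tanh (see earlier sketch).
    t = np.tanh(x)
    return np.where(x >= 0, t, a * t)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, Wz, Uz, Wh, Uh):
    """One step of a simplified GRU with a rectified-tanh candidate.

    z blends the previous hidden state with a candidate state; using
    pretanh for the candidate keeps the hidden state bounded. Dropping
    the reset gate (a simplification assumed here) removes one full set
    of input and recurrent weights, which helps on small training sets.
    """
    z = sigmoid(Wz @ x_t + Uz @ h_prev)        # update gate in (0, 1)
    h_tilde = pretanh(Wh @ x_t + Uh @ h_prev)  # bounded candidate state
    return (1.0 - z) * h_prev + z * h_tilde
```

In the hyperspectral setting, `x_t` would be one spectral band value (or a small group of bands) fed to the cell at step `t`, so a pixel with hundreds of bands becomes a sequence the RNN scans band by band.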