Subject-Independent Emotion Recognition Based on EEG Frequency Band Features and Self-Adaptive Graph Construction


12 March 2024 | Jinhao Zhang, Yanrong Hao, Xin Wen, Chenchen Zhang, Haojie Deng, Juanjuan Zhao and Rui Cao
This paper proposes BFE-Net, a subject-independent emotion recognition model based on EEG frequency band features and self-adaptive graph construction. A multi-graphic layer construction module combines a CNN and a Transformer to adaptively learn a graph structure for each frequency band, and a graph convolutional network (GCN) then aggregates features over each learned graph to produce a single-band emotion representation; together, the band-level representations form a frequency band-based multi-graphic layer representation of emotion.

BFE-Net is evaluated on two public datasets, SEED and SEED-IV, under leave-one-subject-out (LOSO) cross-validation, and it outperforms existing studies in most experimental settings. Performance is reported as mean accuracy and standard deviation across held-out subjects, where BFE-Net achieves higher accuracy and lower standard deviation than competing methods, indicating more stable subject-independent recognition. Confusion matrices and visualizations of the learned adjacency matrices further support these results, and the recovered brain connectivity patterns are partly consistent with previous neuroscientific findings, providing additional validation of the model for subject-independent emotion recognition from EEG signals.
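To illustrate the self-adaptive graph idea (this is a minimal sketch, not the authors' BFE-Net code), the snippet below learns an adjacency matrix for a single frequency band as a trainable parameter and applies one GCN-style aggregation step. The channel count (62, as in SEED), the five per-electrode band features, and the softmax row normalisation are illustrative assumptions; the paper instead derives the graph structure with its CNN/Transformer module.

```python
import torch
import torch.nn as nn

class SelfAdaptiveBandGCN(nn.Module):
    """Minimal sketch: a self-adaptive graph plus one GCN layer for one band.

    Not the authors' implementation; the adjacency is learned directly as a
    parameter here rather than produced by a CNN/Transformer module.
    """

    def __init__(self, num_channels=62, in_dim=5, hidden_dim=64):
        super().__init__()
        # Learnable adjacency logits, adapted during training instead of
        # being fixed from electrode geometry.
        self.adj_logits = nn.Parameter(torch.randn(num_channels, num_channels))
        self.proj = nn.Linear(in_dim, hidden_dim)

    def forward(self, x):
        # x: (batch, channels, in_dim) band-specific features per electrode
        adj = torch.softmax(self.adj_logits, dim=-1)  # row-normalised graph
        x = adj @ x                                   # aggregate neighbour features
        return torch.relu(self.proj(x))               # single-band representation
```

The LOSO evaluation protocol itself is standard. A minimal version using scikit-learn's LeaveOneGroupOut, with placeholder data and a stand-in classifier rather than BFE-Net, would look like this:

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.linear_model import LogisticRegression

# Placeholder data: 15 subjects x 20 segments, 62 channels x 5 band features.
X = np.random.randn(300, 62 * 5)
y = np.random.randint(0, 3, size=300)        # three emotion classes, as in SEED
subjects = np.repeat(np.arange(15), 20)      # subject id for each segment

accuracies = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subjects):
    # Train on 14 subjects, test on the held-out subject.
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    accuracies.append(clf.score(X[test_idx], y[test_idx]))

print(f"LOSO accuracy: {np.mean(accuracies):.3f} ± {np.std(accuracies):.3f}")
```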