15 March 2024 | Ahmed Roshdy, Abdullah Karar, Samer Al Kork, Taha Beyrouthy, Amine Nait-ali
This paper addresses the limitations of relying solely on facial expressions for emotion recognition by proposing an advanced approach that emphasizes continuous monitoring of electroencephalography (EEG) signals. The study leverages advancements in deep learning and machine learning to optimize the configuration of EEG electrodes, enhancing the accuracy of emotion classification systems. The proposed multi-input system integrates facial expression analysis with EEG signals, achieving a more comprehensive and accurate understanding of human emotions. Using just nine electrodes, the system outperforms basic emotion recognition setups, with experimental results validating the effectiveness of combining the two modalities. This approach holds significance across various sectors, including healthcare, psychology, and human-computer interaction. The paper introduces a novel multi-input architecture that collaboratively fuses two Convolutional Neural Networks (CNNs), achieving 87.43% accuracy when integrated with the DeepFace CNN. The seamless integration of facial expressions and brain topographies enables the system to harness the rich information in both modalities for a thorough understanding of human emotions. By capitalizing on the combined advantages of facial expression analysis and EEG-derived brain topography, the technique substantially improves both the precision and the efficiency of emotion recognition systems, laying a foundation for innovative applications across a spectrum of fields.
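The fusion of a facial-expression model with an EEG-topography model can be illustrated with a minimal decision-level (late) fusion sketch. Note that the class names, probability vectors, and equal-weight averaging rule below are illustrative assumptions for exposition, not the paper's actual fusion mechanism, which combines the two CNNs at the network level.

```python
# Hypothetical sketch of decision-level fusion for a two-branch emotion
# recognizer: one model scores facial expressions (e.g. a DeepFace-style CNN),
# the other scores EEG-derived brain topographies. The emotion labels and the
# weighted-average rule are assumptions for illustration only.

EMOTIONS = ["angry", "happy", "neutral", "sad"]

def fuse_predictions(face_probs, eeg_probs, w_face=0.5):
    """Combine per-class probabilities from the two models by weighted
    averaging and return the winning emotion plus the fused scores."""
    if len(face_probs) != len(eeg_probs):
        raise ValueError("both models must score the same emotion classes")
    fused = [w_face * f + (1.0 - w_face) * e
             for f, e in zip(face_probs, eeg_probs)]
    # Pick the emotion with the highest fused probability.
    best = max(range(len(fused)), key=fused.__getitem__)
    return EMOTIONS[best], fused

# Example: the face model strongly favours "happy"; the EEG model agrees
# weakly, so the fused decision is still "happy".
label, scores = fuse_predictions([0.1, 0.7, 0.15, 0.05],
                                 [0.2, 0.4, 0.3, 0.1])
print(label)  # "happy"
```

A weighted average lets one modality compensate when the other is uninformative (e.g. an occluded face), which is the intuition behind combining the two signal sources in the first place.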