15 March 2024 | Ahmed Roshdy, Abdullah Karar, Samer Al Kork, Taha Beyrouthy, Amine Nait-ali
This paper presents an approach to emotion recognition that integrates EEG signals with facial expression analysis to improve the accuracy of emotion classification. It addresses the limitations of relying on facial expressions alone by leveraging EEG signals, which are less susceptible to deception. The proposed multi-input system fuses two deep learning models: an EEG-based Convolutional Neural Network (CNN) and a facial expression recognition CNN. Using nine electrodes, the system converts brain activity into heat map topographies that the EEG branch analyzes, yielding improved accuracy over traditional single-modality methods. The EEG-based CNN alone achieves an accuracy of 87.43%, rising to 91.21% when integrated with the DeepFace CNN. By combining facial expressions and brain topographies, the system draws on the rich information in both modalities to reach a more comprehensive reading of human emotions. This approach is relevant to sectors such as healthcare, psychology, and human-computer interaction. The paper also discusses the importance of EEG in emotion recognition, highlighting its non-invasive nature and ability to capture brain activity in real time, and emphasizes the role of deep learning and machine learning in extracting meaningful features from EEG data for emotion classification.
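The multi-input fusion described above can be illustrated with a minimal sketch: one CNN branch processes an EEG brain-topography heat map, a second branch processes facial-expression features (such as the per-emotion probabilities a DeepFace-style classifier outputs), and the two feature vectors are concatenated before a final classification layer. The layer sizes, input resolution, and seven-class output here are illustrative assumptions, not values taken from the paper.

```python
# Hypothetical sketch of a multi-input emotion classifier: an EEG-topography
# CNN branch fused with facial-expression features. All dimensions are
# assumptions for illustration, not the paper's actual architecture.
import torch
import torch.nn as nn

class MultiInputEmotionNet(nn.Module):
    def __init__(self, n_emotions: int = 7):
        super().__init__()
        # Branch 1: small CNN over a 1x32x32 EEG topography heat map
        self.topo_cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # -> 16 x 16 x 16
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # -> 32 x 8 x 8
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, 64), nn.ReLU(),
        )
        # Branch 2: MLP over facial-expression probabilities
        self.face_mlp = nn.Sequential(
            nn.Linear(n_emotions, 32), nn.ReLU(),
        )
        # Fusion head: concatenate both feature vectors, then classify
        self.head = nn.Linear(64 + 32, n_emotions)

    def forward(self, topo: torch.Tensor, face: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.topo_cnn(topo), self.face_mlp(face)], dim=1)
        return self.head(fused)

model = MultiInputEmotionNet()
# Batch of 4: topography images plus 7-dim facial-expression vectors
logits = model(torch.randn(4, 1, 32, 32), torch.randn(4, 7))
print(logits.shape)  # torch.Size([4, 7])
```

Late fusion by concatenation, as sketched here, is one common way to combine modalities; the paper's reported gain (87.43% to 91.21%) is consistent with the facial branch contributing complementary information to the EEG branch.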
The study also addresses the challenges of EEG-based emotion recognition, including noise and artifacts, and proposes solutions such as optimizing electrode placement and using advanced filtering techniques. The paper concludes with the results of the proposed system, demonstrating its effectiveness in recognizing emotions with high accuracy. The study highlights the potential of EEG-based emotion recognition in various applications, including affective computing and mental health monitoring. The integration of EEG and facial expression analysis offers a promising avenue for future research in emotion recognition.
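The filtering techniques mentioned above can be sketched with a standard zero-phase band-pass filter that suppresses slow drift and high-frequency noise while preserving the usual EEG band. The cutoff frequencies, sampling rate, and synthetic nine-channel signal below are assumptions for the sketch, not parameters reported in the paper.

```python
# Illustrative EEG artifact suppression: a zero-phase Butterworth band-pass
# keeping roughly 1-45 Hz. Cutoffs and sampling rate are assumed values.
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_eeg(signal: np.ndarray, fs: float = 128.0,
                 low: float = 1.0, high: float = 45.0) -> np.ndarray:
    """Zero-phase Butterworth band-pass applied over the time axis."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal, axis=-1)

# Synthetic data for 9 electrodes over 2 s: a 10 Hz alpha rhythm (in band)
# plus a large 0.2 Hz drift artifact (out of band).
fs = 128.0
t = np.arange(0, 2, 1 / fs)
raw = (np.sin(2 * np.pi * 10 * t)            # in-band alpha component
       + 2.0 * np.sin(2 * np.pi * 0.2 * t)   # slow drift artifact
       )[None, :].repeat(9, axis=0)

clean = bandpass_eeg(raw, fs)
# The 0.2 Hz drift is strongly attenuated; the 10 Hz component survives,
# so the filtered signal's peak amplitude drops well below the raw signal's.
```

Using `filtfilt` (forward-backward filtering) avoids phase distortion, which matters when filtered channels are later rendered as topographic heat maps, since phase shifts would misalign activity across electrodes.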