| dc.description.abstract |
Emotion recognition from electroencephalography (EEG) signals has attracted
considerable attention in recent years, as it offers a non-invasive and objective means
of assessing emotional processing. In this paper, we propose a novel hybrid model that
combines the power of three-dimensional Convolutional Neural Networks (3D-CNN)
and Recurrent Neural Networks (RNN) for emotion recognition from EEG signals.
Our model extracts spatiotemporal features using a 3D-CNN and captures temporal
dependencies using an RNN, achieving state-of-the-art performance in recognizing
emotions such as happiness, sadness, anger, and fear. Emotion recognition is a critical aspect of human communication and interaction, and its accurate identification
has significant implications in various fields, including psychology, neuroscience, and
affective computing.
We present a novel method for emotional classification, which is evaluated on the
DEAP dataset. Our method achieves high accuracy in binary classification of valence
and arousal, achieving accuracies of 95.45%, 96.63%, and 97.43%. In the four-class
classification task, our models perform with comparable accuracy. However, binary
classification can distinguish only four regions of the emotion space, whereas 8-class
classification is more fine-grained. We therefore extend our method to 8-class
classification and achieve a promising accuracy of 94.83%. To extract features from
the EEG, physiological, and video signals in the DEAP dataset, we use the Fast Fourier Transform
(FFT). We employ the same 3D-CNN+RNN architecture for all four classification
models, which contributes to the consistency of our results. Overall, our experiments
demonstrate the effectiveness of our proposed method for emotional classification, and
provide evidence that it can perform well in both binary and multi-class classification
scenarios. |
en_US |
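
For readers who want a concrete picture of the pipeline the abstract describes, the sketch below shows one plausible realization in Python/PyTorch. The abstract specifies only that FFT features feed a shared 3D-CNN+RNN classifier; everything else here is an assumption. The band ranges, the 9x9 electrode-grid layout, the layer sizes, and the names `fft_band_powers` and `CNN3DRNN` are illustrative, not the authors' configuration.

```python
import numpy as np
import torch
import torch.nn as nn

def fft_band_powers(eeg, fs=128, bands=((4, 8), (8, 13), (13, 30), (30, 45))):
    """Mean FFT power per frequency band for each channel.

    eeg: (channels, samples) array; fs: sampling rate in Hz (128 Hz in DEAP).
    Band edges here are assumed theta/alpha/beta/gamma splits.
    Returns a (channels, n_bands) array of band powers.
    """
    freqs = np.fft.rfftfreq(eeg.shape[-1], d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg, axis=-1)) ** 2
    return np.stack(
        [power[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1) for lo, hi in bands],
        axis=-1,
    )

class CNN3DRNN(nn.Module):
    """3D-CNN over per-window feature volumes, then an LSTM over windows.

    Layer widths are placeholders; the abstract does not disclose them.
    """

    def __init__(self, n_classes=2, hidden=128):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),  # -> (N, 32, 1, 1, 1)
        )
        self.rnn = nn.LSTM(32, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)  # n_classes: 2, 4, or 8

    def forward(self, x):
        # x: (batch, time, bands, height, width) feature volumes
        b, t = x.shape[:2]
        feats = self.cnn(x.flatten(0, 1).unsqueeze(1))  # fold time into batch
        feats = feats.flatten(1).view(b, t, -1)         # (batch, time, 32)
        out, _ = self.rnn(feats)
        return self.head(out[:, -1])                    # logits from last window

model = CNN3DRNN(n_classes=8)
x = torch.randn(4, 10, 4, 9, 9)  # 4 trials, 10 windows, 4 bands on a 9x9 grid
logits = model(x)                # shape (4, 8)
```

Under these assumptions, the same model class covers the binary, four-class, and 8-class settings by changing only `n_classes`, consistent with the abstract's claim that a single 3D-CNN+RNN architecture is shared across all four classification models.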