Vafaei E, Rahatabad FN, Setarehdan SK, Azadfallah P. Feature Extraction With Stacked Autoencoders for EEG Channel Reduction in Emotion Recognition. BCN 2024; 15(3):393-402.
URL: http://bcn.iums.ac.ir/article-1-2646-en.html
1- Department of Biomedical Engineering, Faculty of Medical Sciences and Technologies, Science and Research Branch, Islamic Azad University, Tehran, Iran.
2- School of Electrical and Computer Engineering, University of Tehran, Tehran, Iran.
3- Faculty of Humanities, Tarbiat Modares University, Tehran, Iran.
Abstract:
Introduction: Emotion recognition from electroencephalogram (EEG) signals is a complex task because the features hidden in the signal are difficult to extract and recognize, and typically require a large number of EEG channels. A method for analyzing these features, together with an algorithm for reducing the number of EEG channels, addresses a clear need in this field.
Methods: Accordingly, this study investigates whether deep learning can reduce the number of channels while preserving the quality of the EEG signal. A stacked autoencoder (SAE) network extracts optimal features for emotion classification along the valence and arousal dimensions. Autoencoder networks can capture complex linear and non-linear features that represent the signal well.
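The following sketch illustrates the general idea of stacked-autoencoder feature extraction described above; it is not the authors' code, and the layer sizes, activation functions, and training settings are illustrative assumptions only.

```python
# Minimal sketch of greedy layer-wise stacked-autoencoder feature extraction
# for per-trial EEG feature vectors (all sizes/settings are assumptions).
import numpy as np
import torch
import torch.nn as nn

def train_autoencoder(x, hidden_dim, epochs=50, lr=1e-3):
    """Train one dense autoencoder layer on x and return its encoder."""
    in_dim = x.shape[1]
    encoder = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.Sigmoid())
    decoder = nn.Sequential(nn.Linear(hidden_dim, in_dim))
    model = nn.Sequential(encoder, decoder)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), x)   # reconstruction objective
        loss.backward()
        opt.step()
    return encoder

# Toy data standing in for flattened per-trial EEG features (trials x features).
x = torch.tensor(np.random.randn(128, 160), dtype=torch.float32)

# Stack two layers greedily: each layer is trained on the previous layer's codes.
enc1 = train_autoencoder(x, hidden_dim=64)
with torch.no_grad():
    h1 = enc1(x)
enc2 = train_autoencoder(h1, hidden_dim=32)
with torch.no_grad():
    features = enc2(h1).numpy()       # compact codes passed to the classifier
```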
Results: A conventional emotion recognition classifier (support vector machine) using the features extracted by the SAE achieved an accuracy of 75.7% for the valence dimension and 74.4% for the arousal dimension.
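A minimal sketch of the downstream classification step is shown below; the RBF kernel, regularization value, and cross-validation scheme are assumptions, not the paper's exact protocol.

```python
# SVM classification of SAE features for a binary valence label (sketch only).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

features = np.random.randn(128, 32)          # stand-in for SAE codes
valence = np.random.randint(0, 2, size=128)  # high/low valence labels

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, features, valence, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")
```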
Conclusion: Further analysis shows that, with the reduced channel set, valence detection relies on a different composition of EEG channels than arousal detection. In addition, the number of channels is reduced from 32 to 12, which, together with these optimal features, is a promising step toward designing a compact EEG device.
Type of Study: Original | Subject: Cognitive Neuroscience
Received: 2023/01/08 | Accepted: 2023/02/20 | Published: 2024/05/01