Feature Pyramid Networks and Long Short-Term Memory for EEG Feature Map-Based Emotion Recognition
Main Authors:
Format: Online Article Text
Language: English
Published: MDPI, 2023
Subjects:
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9921369/
https://www.ncbi.nlm.nih.gov/pubmed/36772661
http://dx.doi.org/10.3390/s23031622
Summary: The EEG data originally collected are 1D sequences, which ignore spatial topology information; a Feature Pyramid Network (FPN) is better than a CNN at detecting small-scale targets and at avoiding insufficient feature extraction under scale transformation. We propose a method of FPN and Long Short-Term Memory (FPN-LSTM) for EEG feature map-based emotion recognition. According to the spatial arrangement of the brain electrodes, the Azimuthal Equidistant Projection (AEP) is employed to generate the 2D EEG map, which preserves the spatial topology information; then, the average power, variance power, and standard deviation power of three frequency bands (β, α, and θ) are extracted as the feature data for the EEG feature map. Bicubic interpolation is employed to fill the blank pixels among the electrodes, and the three frequency-band feature maps are used as the G, R, and B channels to generate the EEG feature maps. Then, we put forward the idea of distributing weight proportions across channels: a large weight is assigned to the channels strongly correlated with emotion (AF3, F3, F7, FC5, and T7) and a small weight to the others, and the proposed FPN-LSTM is applied to the EEG feature maps for emotion recognition. The experiment results show that the proposed method achieves Valence and Arousal recognition rates of 90.05% and 90.84%, respectively.
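The summary above walks through a concrete pipeline, so two hedged sketches follow. The first is a minimal Python sketch of the feature-map construction as described: per-electrode band power, azimuthal equidistant projection (AEP) to the 2D plane, interpolation of the blank pixels among the electrodes, stacking of the three band maps as image channels, and a larger weight on the five channels the summary names. The sampling rate, band edges, grid size, weight values, and band-to-channel order are assumptions, not values taken from the paper, and SciPy's cubic scattered-data interpolation stands in for the bicubic step.

```python
# Hedged sketch of the EEG feature-map construction; sizes, band edges,
# and weights are illustrative assumptions, not values from the paper.
import numpy as np
from scipy.signal import welch
from scipy.interpolate import griddata

FS = 128                                                       # assumed sampling rate
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed band edges
STRONG = {"AF3", "F3", "F7", "FC5", "T7"}                      # channels named in the summary

def aep(x, y, z):
    """Azimuthal equidistant projection of a 3D electrode position to 2D."""
    r = np.sqrt(x * x + y * y + z * z)
    az = np.arctan2(y, x)                        # azimuth angle in the head plane
    rho = np.pi / 2 - np.arcsin(z / r)           # arc distance from the vertex
    return rho * np.cos(az), rho * np.sin(az)

def band_power(eeg, fs=FS):
    """Average PSD power per band for each channel; eeg is (n_ch, n_samples)."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs)
    return {b: psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1)
            for b, (lo, hi) in BANDS.items()}

def feature_map(eeg, names, pos3d, size=32, w_hi=1.5, w_lo=1.0):
    """Stack the three weighted band maps into one (size, size, 3) image."""
    w = np.array([w_hi if n in STRONG else w_lo for n in names])
    xy = np.array([aep(*p) for p in pos3d])      # (n_ch, 2) projected coordinates
    gx, gy = np.meshgrid(np.linspace(xy[:, 0].min(), xy[:, 0].max(), size),
                         np.linspace(xy[:, 1].min(), xy[:, 1].max(), size))
    bp = band_power(eeg)
    layers = [griddata(xy, w * bp[b], (gx, gy), method="cubic", fill_value=0.0)
              for b in ("beta", "alpha", "theta")]   # G, R, B order assumed
    return np.stack(layers, axis=-1)
```

The second is a toy PyTorch skeleton showing how an FPN front end over each feature map could feed an LSTM across a sequence of maps; the layer widths, depth, and fusion details are guesses, since the record does not specify the architecture.

```python
# Toy FPN-LSTM skeleton (assumed layer sizes; not the paper's exact model).
import torch
import torch.nn as nn
import torch.nn.functional as F

class FPNLSTMSketch(nn.Module):
    def __init__(self, n_classes=2, hidden=128):
        super().__init__()
        self.c1 = nn.Conv2d(3, 32, 3, stride=2, padding=1)   # bottom-up pathway
        self.c2 = nn.Conv2d(32, 64, 3, stride=2, padding=1)
        self.c3 = nn.Conv2d(64, 128, 3, stride=2, padding=1)
        self.lat1 = nn.Conv2d(32, 128, 1)                    # lateral 1x1 convs
        self.lat2 = nn.Conv2d(64, 128, 1)
        self.lstm = nn.LSTM(128, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                        # x: (B, T, 3, H, W) map sequence
        B, T = x.shape[:2]
        x = x.flatten(0, 1)                      # fold time into the batch
        f1 = F.relu(self.c1(x))
        f2 = F.relu(self.c2(f1))
        p3 = F.relu(self.c3(f2))                 # top of the pyramid
        p2 = self.lat2(f2) + F.interpolate(p3, size=f2.shape[-2:])  # top-down merge
        p1 = self.lat1(f1) + F.interpolate(p2, size=f1.shape[-2:])
        v = F.adaptive_avg_pool2d(p1, 1).flatten(1).view(B, T, -1)  # per-frame vector
        out, _ = self.lstm(v)                    # temporal modeling over T frames
        return self.head(out[:, -1])             # classify from the last step
```

In both sketches, every name not mentioned in the summary (aep, band_power, feature_map, FPNLSTMSketch, and the numeric constants) is hypothetical and stands in only to make the described pipeline concrete.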