Speech emotion analysis using convolutional neural network (CNN) and gamma classifier-based error correcting output codes (ECOC)
Main Authors:
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2023
Subjects:
Online Access:
  https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10663497/
  https://www.ncbi.nlm.nih.gov/pubmed/37989782
  http://dx.doi.org/10.1038/s41598-023-47118-4
Summary: Speech emotion analysis is one of the most basic requirements for the evolution of Artificial Intelligence (AI) in the field of human–machine interaction. Accurate emotion recognition in speech can be effective in applications such as online support, lie detection systems and customer feedback analysis. However, existing techniques in this field have not yet reached sufficient maturity. This paper presents a new method to improve the performance of emotion analysis in speech. The proposed method includes the following steps: pre-processing, feature description, feature extraction, and classification. The initial description of speech features in the proposed method is done by combining spectro-temporal modulation (STM) and entropy features. A Convolutional Neural Network (CNN) is then utilized to reduce the dimensionality of these features and extract the features of each signal. Finally, the combination of a gamma classifier (GC) and Error-Correcting Output Codes (ECOC) is applied to classify the features and recognize the emotions in speech. The performance of the proposed method has been evaluated using two datasets, Berlin and ShEMO. The results show that the proposed method can recognize speech emotions in the Berlin and ShEMO datasets with average accuracies of 93.33% and 85.73%, respectively, which is at least 6.67% better than the compared methods.
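The abstract describes a four-stage pipeline (pre-processing, feature description with STM and entropy, CNN feature extraction, ECOC-based classification). The sketch below is only a minimal illustration of that structure, not the authors' implementation: the spectro-temporal maps and the seven-class labels are synthetic stand-ins, the CNN architecture is invented for illustration, and since the gamma classifier is not available in scikit-learn, a k-nearest-neighbours base estimator is substituted inside the ECOC wrapper.

```python
# Hypothetical sketch of the pipeline described in the abstract:
# spectro-temporal + entropy features -> CNN feature extractor -> ECOC classifier.
# The gamma classifier is replaced by a k-NN stand-in; data are synthetic.

import numpy as np
import torch
import torch.nn as nn
from scipy.stats import entropy
from sklearn.multiclass import OutputCodeClassifier
from sklearn.neighbors import KNeighborsClassifier

def describe_signal(spectrogram: np.ndarray) -> np.ndarray:
    """Stack a (freq x time) spectro-temporal map with per-frame spectral entropy."""
    p = spectrogram / (spectrogram.sum(axis=0, keepdims=True) + 1e-12)
    frame_entropy = entropy(p, axis=0)            # one entropy value per time frame
    return np.vstack([spectrogram, frame_entropy[None, :]])

class CNNFeatureExtractor(nn.Module):
    """Small CNN mapping a 2-D feature map to a fixed-length embedding."""
    def __init__(self, embedding_dim: int = 64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.fc = nn.Linear(32 * 4 * 4, embedding_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc(self.conv(x).flatten(1))

# Toy data standing in for Berlin / ShEMO spectrograms (7 emotion classes assumed).
rng = np.random.default_rng(0)
spectrograms = [rng.random((40, 100)) for _ in range(32)]
labels = rng.integers(0, 7, size=32)

extractor = CNNFeatureExtractor().eval()
with torch.no_grad():
    feats = np.stack([
        extractor(torch.tensor(describe_signal(s), dtype=torch.float32)[None, None])
        .numpy().ravel()
        for s in spectrograms
    ])

# ECOC wrapper around the stand-in base classifier.
ecoc = OutputCodeClassifier(estimator=KNeighborsClassifier(n_neighbors=3),
                            code_size=2, random_state=0)
ecoc.fit(feats, labels)
print(ecoc.predict(feats[:5]))
```

In this arrangement the ECOC stage turns the multi-class emotion problem into several binary problems over the CNN embeddings, mirroring the role the paper assigns to the GC-ECOC combination; swapping the k-NN stand-in for a gamma classifier would recover the structure the abstract describes.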