Environmental Noise Classification Using Convolutional Neural Networks with Input Transform for Hearing Aids
Main Authors:
Format: Online Article Text
Language: English
Published: MDPI, 2020
Subjects:
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7178286/
https://www.ncbi.nlm.nih.gov/pubmed/32230966
http://dx.doi.org/10.3390/ijerph17072270
Summary: Hearing aids are essential for people with hearing loss, and noise estimation and classification are among the most important technologies used in these devices. This paper presents an environmental noise classification algorithm for hearing aids that uses convolutional neural networks (CNNs) and image signals transformed from sound signals. The algorithm was developed using recordings of ten types of noise acquired from the living environments in which they occur. Spectrogram images transformed from the sound data are used as the CNN input after being processed with a sharpening mask and a median filter. The classification results of the proposed algorithm were compared with those of other noise classification methods. A maximum correct classification accuracy of 99.25% was achieved by the proposed algorithm for a spectrogram time length of 1 s, with accuracy decreasing as the spectrogram time length increased to 8 s. For a spectrogram time length of 8 s and using the sharpening mask and median filter, the classification accuracy was 98.73%, which is comparable with the 98.79% achieved by the conventional method for a time length of 1 s. The proposed hearing aid noise classification algorithm thus offers lower computational complexity without compromising performance.
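The summary describes a concrete preprocessing pipeline: each sound segment is converted to a spectrogram image, sharpened, and median-filtered before being fed to the CNN. The Python sketch below illustrates one plausible reading of that pipeline, with the sharpening step implemented as an unsharp mask (a common form of sharpening filter). The STFT parameters, filter strengths, and the `sound_to_spectrogram_image` helper name are illustrative assumptions, not values or names taken from the paper.

```python
import numpy as np
from scipy.signal import spectrogram
from scipy.ndimage import gaussian_filter, median_filter

def sound_to_spectrogram_image(samples: np.ndarray, fs: int,
                               segment_s: float = 1.0,
                               sharpen_amount: float = 1.0,
                               median_size: int = 3) -> np.ndarray:
    """Turn one noise segment into a sharpened, median-filtered
    log-spectrogram image suitable as CNN input.

    Sketch of the preprocessing described in the abstract; window
    sizes and filter strengths here are assumptions, not values
    reported in the paper.
    """
    n = int(segment_s * fs)
    # Log-power spectrogram of the first `segment_s` seconds
    f, t, sxx = spectrogram(samples[:n], fs=fs, nperseg=256, noverlap=128)
    img = 10.0 * np.log10(sxx + 1e-12)
    # Sharpening mask, implemented as an unsharp mask:
    # original + amount * (original - blurred)
    blurred = gaussian_filter(img, sigma=1.0)
    sharpened = img + sharpen_amount * (img - blurred)
    # Median filter suppresses the speckle that sharpening amplifies
    return median_filter(sharpened, size=median_size)

# Example: a 1-s white-noise segment at 16 kHz
fs = 16_000
image = sound_to_spectrogram_image(np.random.randn(fs), fs)
print(image.shape)  # (frequency bins, time frames)
```

Note that a longer segment (e.g., 8 s) means the classifier runs far less often per second of audio, which is presumably the source of the computational savings the abstract claims for the 8 s configuration relative to the conventional 1 s method.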