
Hyper-Parameter Optimization of Stacked Asymmetric Auto-Encoders for Automatic Personality Traits Perception

In this work, a method for automatic hyper-parameter tuning of the stacked asymmetric auto-encoder is proposed. Previous work demonstrated the ability of deep learning to extract personality perception from speech, but hyper-parameter tuning was done by trial and error, which is time-consuming and requires machine-learning knowledge. Obtaining suitable hyper-parameter values is therefore challenging and limits the use of deep learning. To address this challenge, researchers have applied optimization methods. Despite some successes, the large number of deep-learning hyper-parameters makes the search space very large, which increases the probability of getting stuck in local optima. Researchers have therefore also focused on improving global optimization methods. In this regard, we propose a novel global optimization method based on the cultural algorithm, the multi-island model, and the concept of parallelism to search this large space intelligently. We first evaluated our method on three well-known optimization benchmarks and compared the results with recently published papers. The results indicate that the proposed method converges faster, owing to its ability to escape local optima, and that the precision of the results improves dramatically. We then applied the method to optimize five hyper-parameters of an asymmetric auto-encoder for automatic personality perception. Since inappropriate hyper-parameters lead the network to over-fitting or under-fitting, we used a novel cost function to prevent both. The unweighted average recall (accuracy) improved by 6.52% (9.54%) compared with our previous work and compares favorably with other published works on personality perception.


Bibliographic Details
Main Authors: Jalaeian Zaferani, Effat, Teshnehlab, Mohammad, Khodadadian, Amirreza, Heitzinger, Clemens, Vali, Mansour, Noii, Nima, Wick, Thomas
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9413006/
https://www.ncbi.nlm.nih.gov/pubmed/36015967
http://dx.doi.org/10.3390/s22166206
_version_ 1784775632610131968
author Jalaeian Zaferani, Effat
Teshnehlab, Mohammad
Khodadadian, Amirreza
Heitzinger, Clemens
Vali, Mansour
Noii, Nima
Wick, Thomas
author_facet Jalaeian Zaferani, Effat
Teshnehlab, Mohammad
Khodadadian, Amirreza
Heitzinger, Clemens
Vali, Mansour
Noii, Nima
Wick, Thomas
author_sort Jalaeian Zaferani, Effat
collection PubMed
description In this work, a method for automatic hyper-parameter tuning of the stacked asymmetric auto-encoder is proposed. Previous work demonstrated the ability of deep learning to extract personality perception from speech, but hyper-parameter tuning was done by trial and error, which is time-consuming and requires machine-learning knowledge. Obtaining suitable hyper-parameter values is therefore challenging and limits the use of deep learning. To address this challenge, researchers have applied optimization methods. Despite some successes, the large number of deep-learning hyper-parameters makes the search space very large, which increases the probability of getting stuck in local optima. Researchers have therefore also focused on improving global optimization methods. In this regard, we propose a novel global optimization method based on the cultural algorithm, the multi-island model, and the concept of parallelism to search this large space intelligently. We first evaluated our method on three well-known optimization benchmarks and compared the results with recently published papers. The results indicate that the proposed method converges faster, owing to its ability to escape local optima, and that the precision of the results improves dramatically. We then applied the method to optimize five hyper-parameters of an asymmetric auto-encoder for automatic personality perception. Since inappropriate hyper-parameters lead the network to over-fitting or under-fitting, we used a novel cost function to prevent both. The unweighted average recall (accuracy) improved by 6.52% (9.54%) compared with our previous work and compares favorably with other published works on personality perception.
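The description above mentions a cultural algorithm combined with a multi-island population model and parallelism, evaluated first on standard optimization benchmarks, but gives no implementation detail. The following Python sketch is only an illustration of that general scheme, assuming a ring-migration topology, a simple situational belief space, and the Rastrigin benchmark as a stand-in objective; the island count, population size, migration interval, and variation rule are illustrative assumptions and are not taken from the paper.

```python
# Hypothetical sketch of a multi-island, cultural-algorithm-style global search.
# All settings below (islands, population size, migration interval, Rastrigin
# objective) are illustrative assumptions, not the paper's configuration.
import math
import random

def rastrigin(x):
    """Classic multi-modal benchmark; stands in for the real cost function."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def new_individual(dim, lo, hi):
    return [random.uniform(lo, hi) for _ in range(dim)]

def evolve_island(pop, belief, lo, hi, objective):
    """One generation on one island, guided by the island's belief space."""
    scored = sorted(pop, key=objective)
    best = scored[0]
    # Update the situational knowledge (best solution this island has seen).
    if belief is None or objective(best) < objective(belief):
        belief = best[:]
    children = []
    for parent in scored[: len(pop) // 2]:          # better half become parents
        # Offspring are pulled toward the belief solution, plus Gaussian noise.
        child = [min(hi, max(lo, b + 0.5 * (p - b) + random.gauss(0, 0.1)))
                 for p, b in zip(parent, belief)]
        children.append(child)
    return scored[: len(pop) - len(children)] + children, belief

def multi_island_search(objective, dim=5, islands=4, pop_size=20,
                        generations=200, migrate_every=20, lo=-5.12, hi=5.12):
    pops = [[new_individual(dim, lo, hi) for _ in range(pop_size)]
            for _ in range(islands)]
    beliefs = [None] * islands
    for gen in range(generations):
        for i in range(islands):
            pops[i], beliefs[i] = evolve_island(pops[i], beliefs[i], lo, hi, objective)
        if gen % migrate_every == 0:
            # Ring migration: each island sends a copy of its best individual to
            # the next island, replacing one of that island's members.
            bests = [min(p, key=objective) for p in pops]
            for i in range(islands):
                pops[(i + 1) % islands][-1] = bests[i][:]
    return min((min(p, key=objective) for p in pops), key=objective)

if __name__ == "__main__":
    best = multi_island_search(rastrigin)
    print("best solution:", best, "cost:", rastrigin(best))
```

In island models generally, this periodic exchange of good solutions is what helps a stagnating island move away from a local optimum, which matches the behaviour the abstract credits for the faster convergence.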
format Online
Article
Text
id pubmed-9413006
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9413006 2022-08-27 Hyper-Parameter Optimization of Stacked Asymmetric Auto-Encoders for Automatic Personality Traits Perception Jalaeian Zaferani, Effat Teshnehlab, Mohammad Khodadadian, Amirreza Heitzinger, Clemens Vali, Mansour Noii, Nima Wick, Thomas Sensors (Basel) Article In this work, a method for automatic hyper-parameter tuning of the stacked asymmetric auto-encoder is proposed. Previous work demonstrated the ability of deep learning to extract personality perception from speech, but hyper-parameter tuning was done by trial and error, which is time-consuming and requires machine-learning knowledge. Obtaining suitable hyper-parameter values is therefore challenging and limits the use of deep learning. To address this challenge, researchers have applied optimization methods. Despite some successes, the large number of deep-learning hyper-parameters makes the search space very large, which increases the probability of getting stuck in local optima. Researchers have therefore also focused on improving global optimization methods. In this regard, we propose a novel global optimization method based on the cultural algorithm, the multi-island model, and the concept of parallelism to search this large space intelligently. We first evaluated our method on three well-known optimization benchmarks and compared the results with recently published papers. The results indicate that the proposed method converges faster, owing to its ability to escape local optima, and that the precision of the results improves dramatically. We then applied the method to optimize five hyper-parameters of an asymmetric auto-encoder for automatic personality perception. Since inappropriate hyper-parameters lead the network to over-fitting or under-fitting, we used a novel cost function to prevent both. The unweighted average recall (accuracy) improved by 6.52% (9.54%) compared with our previous work and compares favorably with other published works on personality perception. MDPI 2022-08-18 /pmc/articles/PMC9413006/ /pubmed/36015967 http://dx.doi.org/10.3390/s22166206 Text en © 2022 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
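The abstract also mentions a novel cost function that guards against both over-fitting and under-fitting, without giving its form. A minimal, purely illustrative sketch of that idea follows: the penalty structure, the `alpha` weight, the `train_model` interface, and the toy `fake_train` routine are all assumptions and are not the authors' formulation; in the paper, such a cost would serve as the objective that the island-based search shown earlier minimizes over the five hyper-parameters.

```python
# Hypothetical sketch of a cost that discourages both under- and over-fitting.
# The weighting (alpha) and the placeholder training routine are illustrative
# assumptions; the paper's actual cost function is not reproduced here.
def overfit_aware_cost(hparams, train_model, alpha=1.0):
    """Return a scalar cost for one hyper-parameter setting.

    `train_model` is a user-supplied callable that trains the network with
    `hparams` and returns (training_error, validation_error).
    """
    train_err, val_err = train_model(hparams)
    underfit_penalty = train_err                      # high training error -> under-fitting
    overfit_penalty = max(0.0, val_err - train_err)   # large generalization gap -> over-fitting
    return underfit_penalty + alpha * overfit_penalty


# Toy usage with a stand-in "training" routine so the sketch runs end to end.
def fake_train(hparams):
    lr, layers = hparams["learning_rate"], hparams["num_layers"]
    train_err = 1.0 / (1.0 + lr * layers)   # pretend more capacity fits the data better
    val_err = train_err + 0.05 * layers     # but widens the generalization gap
    return train_err, val_err

print(overfit_aware_cost({"learning_rate": 0.01, "num_layers": 3}, fake_train))
```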
spellingShingle Article
Jalaeian Zaferani, Effat
Teshnehlab, Mohammad
Khodadadian, Amirreza
Heitzinger, Clemens
Vali, Mansour
Noii, Nima
Wick, Thomas
Hyper-Parameter Optimization of Stacked Asymmetric Auto-Encoders for Automatic Personality Traits Perception
title Hyper-Parameter Optimization of Stacked Asymmetric Auto-Encoders for Automatic Personality Traits Perception
title_full Hyper-Parameter Optimization of Stacked Asymmetric Auto-Encoders for Automatic Personality Traits Perception
title_fullStr Hyper-Parameter Optimization of Stacked Asymmetric Auto-Encoders for Automatic Personality Traits Perception
title_full_unstemmed Hyper-Parameter Optimization of Stacked Asymmetric Auto-Encoders for Automatic Personality Traits Perception
title_short Hyper-Parameter Optimization of Stacked Asymmetric Auto-Encoders for Automatic Personality Traits Perception
title_sort hyper-parameter optimization of stacked asymmetric auto-encoders for automatic personality traits perception
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9413006/
https://www.ncbi.nlm.nih.gov/pubmed/36015967
http://dx.doi.org/10.3390/s22166206
work_keys_str_mv AT jalaeianzaferanieffat hyperparameteroptimizationofstackedasymmetricautoencodersforautomaticpersonalitytraitsperception
AT teshnehlabmohammad hyperparameteroptimizationofstackedasymmetricautoencodersforautomaticpersonalitytraitsperception
AT khodadadianamirreza hyperparameteroptimizationofstackedasymmetricautoencodersforautomaticpersonalitytraitsperception
AT heitzingerclemens hyperparameteroptimizationofstackedasymmetricautoencodersforautomaticpersonalitytraitsperception
AT valimansour hyperparameteroptimizationofstackedasymmetricautoencodersforautomaticpersonalitytraitsperception
AT noiinima hyperparameteroptimizationofstackedasymmetricautoencodersforautomaticpersonalitytraitsperception
AT wickthomas hyperparameteroptimizationofstackedasymmetricautoencodersforautomaticpersonalitytraitsperception