
Attention-Aware Patch-Based CNN for Blind 360-Degree Image Quality Assessment

An attention-aware patch-based deep-learning model for blind 360-degree image quality assessment (360-IQA) is introduced in this paper. It employs spatial attention mechanisms to focus on spatially significant features, in addition to short skip connections to align them. A long skip connection is adopted to allow features from the earliest layers to be used at the final level. Patches are properly sampled on the sphere to correspond to the viewports displayed to the user using head-mounted displays. The sampling incorporates the relevance of patches by considering (i) the exploration behavior and (ii) a latitude-based selection. An adaptive strategy is applied to improve the pooling of local patch qualities into a global image quality. This includes an outlier score rejection step relying on the standard deviation of the obtained scores to account for their agreement, as well as saliency-based weighting of the scores according to their visual significance. Experiments on available 360-IQA databases show that our model outperforms the state of the art in terms of accuracy and generalization ability. This is valid for general deep-learning-based models, multichannel models, and natural scene statistic-based models. Furthermore, when compared to multichannel models, the computational complexity is significantly reduced. Finally, an extensive ablation study gives insights into the efficacy of each component of the proposed model.
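
The adaptive score-pooling step described in the abstract (outlier rejection based on the standard deviation of the patch scores, followed by saliency weighting) can be illustrated with a minimal sketch. The Python snippet below is our own illustration under simple assumptions, not the authors' implementation: the function name pool_patch_scores, the rejection threshold k, and the example inputs are hypothetical.

```python
import numpy as np

def pool_patch_scores(patch_scores, patch_saliency, k=1.0):
    """Illustrative pooling of per-patch quality scores into a global score.

    patch_scores:   predicted quality score for each sampled patch.
    patch_saliency: visual-saliency weight for the same patches.
    k:              width of the acceptance band in standard deviations
                    (hypothetical parameter, not specified in the abstract).
    """
    scores = np.asarray(patch_scores, dtype=float)
    saliency = np.asarray(patch_saliency, dtype=float)

    # Outlier rejection: keep only scores within k standard deviations of the
    # mean, i.e. patches that agree with the bulk of the predictions.
    mu, sigma = scores.mean(), scores.std()
    keep = np.abs(scores - mu) <= k * sigma
    if not keep.any():  # degenerate case: fall back to using all patches
        keep = np.ones_like(keep, dtype=bool)

    # Saliency weighting: visually significant patches contribute more
    # to the global image-quality estimate.
    w = saliency[keep]
    w = w / w.sum() if w.sum() > 0 else np.full(w.shape, 1.0 / w.size)
    return float(np.dot(w, scores[keep]))

# Example: the fourth patch score (1.2) is rejected as an outlier,
# and the remaining scores are averaged with saliency-derived weights.
print(pool_patch_scores([4.1, 4.3, 3.9, 1.2, 4.0], [0.9, 0.7, 0.6, 0.2, 0.8]))
```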

Bibliographic Details
Main Authors: Sendjasni, Abderrezzaq, Larabi, Mohamed-Chaker
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10647793/
https://www.ncbi.nlm.nih.gov/pubmed/37960376
http://dx.doi.org/10.3390/s23218676
collection PubMed
id pubmed-10647793
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
journal Sensors (Basel)
published 2023-10-24
license © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
topic Article