Automatic Fire Detection and Notification System Based on Improved YOLOv4 for the Blind and Visually Impaired
The growing aging population suffers from high levels of vision and cognitive impairment, often resulting in a loss of independence. Such individuals must perform crucial everyday tasks such as cooking and heating with systems and devices designed for visually unimpaired individuals, which do not take into account the needs of persons with visual and cognitive impairment. Thus, the visually impaired persons using them run risks related to smoke and fire. In this paper, we propose a vision-based fire detection and notification system using smart glasses and deep learning models for blind and visually impaired (BVI) people. The system enables early detection of fires in indoor environments. To perform real-time fire detection and notification, the proposed system uses image brightness and a new convolutional neural network employing an improved YOLOv4 model with a convolutional block attention module. The h-swish activation function is used to reduce the running time and increase the robustness of YOLOv4. We adapt our previously developed smart glasses system to capture images and inform BVI people about fires and other surrounding objects through auditory messages. We create a large fire image dataset with indoor fire scenes to accurately detect fires. Furthermore, we develop an object mapping approach to provide BVI people with complete information about surrounding objects and to differentiate between hazardous and nonhazardous fires. The proposed system shows an improvement over other well-known approaches in all fire detection metrics such as precision, recall, and average precision.
Main Authors: | Mukhiddinov, Mukhriddin; Abdusalomov, Akmalbek Bobomirzaevich; Cho, Jinsoo |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2022 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9103130/ https://www.ncbi.nlm.nih.gov/pubmed/35590996 http://dx.doi.org/10.3390/s22093307 |
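
The abstract above states that the system uses image brightness together with the improved YOLOv4 network for real-time fire detection and notification, but the record does not say how the brightness cue is computed or applied. The sketch below is only one plausible reading under that assumption: a cheap luminance pre-filter run on frames from the smart-glasses camera before invoking the detector. All function names, luma weights, and thresholds are illustrative and not the authors' implementation.

```python
# Hypothetical brightness pre-filter (assumption, not the paper's code):
# skip the expensive detector unless the frame is noticeably brighter
# than a running baseline, as a sudden luminance jump in an indoor scene
# can indicate a flame.
import numpy as np


def mean_brightness(frame_rgb: np.ndarray) -> float:
    """Mean luma (ITU-R BT.601 weights) of an RGB frame, in [0, 255]."""
    r, g, b = frame_rgb[..., 0], frame_rgb[..., 1], frame_rgb[..., 2]
    luma = 0.299 * r + 0.587 * g + 0.114 * b
    return float(luma.mean())


def should_run_detector(frame_rgb: np.ndarray,
                        baseline: float,
                        jump_threshold: float = 40.0) -> bool:
    """Trigger the YOLO-based detector only when the current frame is
    clearly brighter than the baseline luminance (threshold is illustrative)."""
    return mean_brightness(frame_rgb) - baseline > jump_threshold
```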
_version_ | 1784707488512212992 |
---|---|
author | Mukhiddinov, Mukhriddin Abdusalomov, Akmalbek Bobomirzaevich Cho, Jinsoo |
author_facet | Mukhiddinov, Mukhriddin Abdusalomov, Akmalbek Bobomirzaevich Cho, Jinsoo |
author_sort | Mukhiddinov, Mukhriddin |
collection | PubMed |
description | The growing aging population suffers from high levels of vision and cognitive impairment, often resulting in a loss of independence. Such individuals must perform crucial everyday tasks such as cooking and heating with systems and devices designed for visually unimpaired individuals, which do not take into account the needs of persons with visual and cognitive impairment. Thus, the visually impaired persons using them run risks related to smoke and fire. In this paper, we propose a vision-based fire detection and notification system using smart glasses and deep learning models for blind and visually impaired (BVI) people. The system enables early detection of fires in indoor environments. To perform real-time fire detection and notification, the proposed system uses image brightness and a new convolutional neural network employing an improved YOLOv4 model with a convolutional block attention module. The h-swish activation function is used to reduce the running time and increase the robustness of YOLOv4. We adapt our previously developed smart glasses system to capture images and inform BVI people about fires and other surrounding objects through auditory messages. We create a large fire image dataset with indoor fire scenes to accurately detect fires. Furthermore, we develop an object mapping approach to provide BVI people with complete information about surrounding objects and to differentiate between hazardous and nonhazardous fires. The proposed system shows an improvement over other well-known approaches in all fire detection metrics such as precision, recall, and average precision. |
format | Online Article Text |
id | pubmed-9103130 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-9103130 2022-05-14 Automatic Fire Detection and Notification System Based on Improved YOLOv4 for the Blind and Visually Impaired Mukhiddinov, Mukhriddin Abdusalomov, Akmalbek Bobomirzaevich Cho, Jinsoo Sensors (Basel) Article The growing aging population suffers from high levels of vision and cognitive impairment, often resulting in a loss of independence. Such individuals must perform crucial everyday tasks such as cooking and heating with systems and devices designed for visually unimpaired individuals, which do not take into account the needs of persons with visual and cognitive impairment. Thus, the visually impaired persons using them run risks related to smoke and fire. In this paper, we propose a vision-based fire detection and notification system using smart glasses and deep learning models for blind and visually impaired (BVI) people. The system enables early detection of fires in indoor environments. To perform real-time fire detection and notification, the proposed system uses image brightness and a new convolutional neural network employing an improved YOLOv4 model with a convolutional block attention module. The h-swish activation function is used to reduce the running time and increase the robustness of YOLOv4. We adapt our previously developed smart glasses system to capture images and inform BVI people about fires and other surrounding objects through auditory messages. We create a large fire image dataset with indoor fire scenes to accurately detect fires. Furthermore, we develop an object mapping approach to provide BVI people with complete information about surrounding objects and to differentiate between hazardous and nonhazardous fires. The proposed system shows an improvement over other well-known approaches in all fire detection metrics such as precision, recall, and average precision. MDPI 2022-04-26 /pmc/articles/PMC9103130/ /pubmed/35590996 http://dx.doi.org/10.3390/s22093307 Text en © 2022 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Mukhiddinov, Mukhriddin Abdusalomov, Akmalbek Bobomirzaevich Cho, Jinsoo Automatic Fire Detection and Notification System Based on Improved YOLOv4 for the Blind and Visually Impaired |
title | Automatic Fire Detection and Notification System Based on Improved YOLOv4 for the Blind and Visually Impaired |
title_full | Automatic Fire Detection and Notification System Based on Improved YOLOv4 for the Blind and Visually Impaired |
title_fullStr | Automatic Fire Detection and Notification System Based on Improved YOLOv4 for the Blind and Visually Impaired |
title_full_unstemmed | Automatic Fire Detection and Notification System Based on Improved YOLOv4 for the Blind and Visually Impaired |
title_short | Automatic Fire Detection and Notification System Based on Improved YOLOv4 for the Blind and Visually Impaired |
title_sort | automatic fire detection and notification system based on improved yolov4 for the blind and visually impaired |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9103130/ https://www.ncbi.nlm.nih.gov/pubmed/35590996 http://dx.doi.org/10.3390/s22093307 |
work_keys_str_mv | AT mukhiddinovmukhriddin automaticfiredetectionandnotificationsystembasedonimprovedyolov4fortheblindandvisuallyimpaired AT abdusalomovakmalbekbobomirzaevich automaticfiredetectionandnotificationsystembasedonimprovedyolov4fortheblindandvisuallyimpaired AT chojinsoo automaticfiredetectionandnotificationsystembasedonimprovedyolov4fortheblindandvisuallyimpaired |
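
The description field above names two concrete changes to YOLOv4: a convolutional block attention module (CBAM) and the h-swish activation used to reduce running time. A minimal sketch of both follows, assuming a PyTorch implementation; the record contains no code, so the module structure, the reduction ratio, and the 7x7 kernel follow the original CBAM formulation rather than this specific system.

```python
# Sketch of h-swish and CBAM (assumed PyTorch; hyperparameters follow the
# original CBAM paper, not necessarily the authors' configuration).
import torch
import torch.nn as nn
import torch.nn.functional as F


class HSwish(nn.Module):
    """h-swish(x) = x * ReLU6(x + 3) / 6 -- a cheap approximation of swish
    used to cut inference time on resource-limited hardware."""
    def forward(self, x):
        return x * F.relu6(x + 3.0) / 6.0


class ChannelAttention(nn.Module):
    """Squeeze spatial dims with avg- and max-pooling, pass both through a
    shared MLP, and gate the channels with a sigmoid."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(F.adaptive_avg_pool2d(x, 1).view(b, c))
        mx = self.mlp(F.adaptive_max_pool2d(x, 1).view(b, c))
        scale = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * scale


class SpatialAttention(nn.Module):
    """Concatenate channel-wise mean and max maps, then gate locations with
    a 7x7 convolution followed by a sigmoid."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx, _ = x.max(dim=1, keepdim=True)
        scale = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * scale


class CBAM(nn.Module):
    """Channel attention followed by spatial attention over a feature map."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.channel = ChannelAttention(channels, reduction)
        self.spatial = SpatialAttention()

    def forward(self, x):
        return self.spatial(self.channel(x))


if __name__ == "__main__":
    feat = torch.randn(1, 256, 52, 52)   # a YOLOv4-sized feature map
    out = CBAM(channels=256)(feat)       # same shape, attention-weighted
    print(out.shape)
```

In a YOLOv4-style detector, a block like this would typically be applied to selected backbone or neck feature maps, with HSwish replacing heavier activations to lower latency on wearable hardware; how exactly the authors placed these modules is not specified in this record.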