Illuminating the Neural Landscape of Pilot Mental States: A Convolutional Neural Network Approach with Shapley Additive Explanations Interpretability
Predicting pilots’ mental states is a critical challenge in aviation safety and performance, with electroencephalogram data offering a promising avenue for detection. However, the interpretability of machine learning and deep learning models, which are often used for such tasks, remains a significant issue. This study aims to address these challenges by developing an interpretable model to detect four mental states—channelised attention, diverted attention, startle/surprise, and normal state—in pilots using EEG data. The methodology involves training a convolutional neural network on power spectral density features of EEG data from 17 pilots. The model’s interpretability is enhanced via the use of SHapley Additive exPlanations values, which identify the top 10 most influential features for each mental state. The results demonstrate high performance in all metrics, with an average accuracy of 96%, a precision of 96%, a recall of 94%, and an F1 score of 95%. An examination of the effects of mental states on EEG frequency bands further elucidates the neural mechanisms underlying these states. The innovative nature of this study lies in its combination of high-performance model development, improved interpretability, and in-depth analysis of the neural correlates of mental states. This approach not only addresses the critical need for effective and interpretable mental state detection in aviation but also contributes to our understanding of the neural underpinnings of these states. This study thus represents a significant advancement in the field of EEG-based mental state detection.
Main Authors: | Alreshidi, Ibrahim; Bisandu, Desmond; Moulitsas, Irene |
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2023 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10674947/ https://www.ncbi.nlm.nih.gov/pubmed/38005440 http://dx.doi.org/10.3390/s23229052 |
_version_ | 1785140948312784896 |
author | Alreshidi, Ibrahim; Bisandu, Desmond; Moulitsas, Irene |
author_facet | Alreshidi, Ibrahim; Bisandu, Desmond; Moulitsas, Irene |
author_sort | Alreshidi, Ibrahim |
collection | PubMed |
description | Predicting pilots’ mental states is a critical challenge in aviation safety and performance, with electroencephalogram data offering a promising avenue for detection. However, the interpretability of machine learning and deep learning models, which are often used for such tasks, remains a significant issue. This study aims to address these challenges by developing an interpretable model to detect four mental states—channelised attention, diverted attention, startle/surprise, and normal state—in pilots using EEG data. The methodology involves training a convolutional neural network on power spectral density features of EEG data from 17 pilots. The model’s interpretability is enhanced via the use of SHapley Additive exPlanations values, which identify the top 10 most influential features for each mental state. The results demonstrate high performance in all metrics, with an average accuracy of 96%, a precision of 96%, a recall of 94%, and an F1 score of 95%. An examination of the effects of mental states on EEG frequency bands further elucidates the neural mechanisms underlying these states. The innovative nature of this study lies in its combination of high-performance model development, improved interpretability, and in-depth analysis of the neural correlates of mental states. This approach not only addresses the critical need for effective and interpretable mental state detection in aviation but also contributes to our understanding of the neural underpinnings of these states. This study thus represents a significant advancement in the field of EEG-based mental state detection. |
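The description above names the feature-extraction step: power spectral density features computed from EEG and fed to a convolutional neural network. A minimal sketch of that step is shown below, using Welch's method on a synthetic single-channel signal; the 256 Hz sampling rate and the band edges are assumptions for illustration and are not taken from the record.

```python
# Hedged sketch of PSD band-power feature extraction, as described in the
# abstract. Sampling rate and band edges are assumed, not from the paper.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate (Hz)
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(signal, fs=FS):
    """Return mean PSD power per canonical EEG band for one channel."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    feats = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        feats[name] = psd[mask].mean()
    return feats

rng = np.random.default_rng(0)
eeg = rng.standard_normal(FS * 10)  # 10 s of synthetic "EEG"
features = band_powers(eeg)
print(sorted(features))  # → ['alpha', 'beta', 'delta', 'gamma', 'theta']
```

In the study, such per-band, per-channel features would be stacked into the CNN's input, and SHAP values computed over them to rank the 10 most influential features per mental state.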
format | Online Article Text |
id | pubmed-10674947 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-106749472023-11-08 Illuminating the Neural Landscape of Pilot Mental States: A Convolutional Neural Network Approach with Shapley Additive Explanations Interpretability Alreshidi, Ibrahim Bisandu, Desmond Moulitsas, Irene Sensors (Basel) Article Predicting pilots’ mental states is a critical challenge in aviation safety and performance, with electroencephalogram data offering a promising avenue for detection. However, the interpretability of machine learning and deep learning models, which are often used for such tasks, remains a significant issue. This study aims to address these challenges by developing an interpretable model to detect four mental states—channelised attention, diverted attention, startle/surprise, and normal state—in pilots using EEG data. The methodology involves training a convolutional neural network on power spectral density features of EEG data from 17 pilots. The model’s interpretability is enhanced via the use of SHapley Additive exPlanations values, which identify the top 10 most influential features for each mental state. The results demonstrate high performance in all metrics, with an average accuracy of 96%, a precision of 96%, a recall of 94%, and an F1 score of 95%. An examination of the effects of mental states on EEG frequency bands further elucidates the neural mechanisms underlying these states. The innovative nature of this study lies in its combination of high-performance model development, improved interpretability, and in-depth analysis of the neural correlates of mental states. This approach not only addresses the critical need for effective and interpretable mental state detection in aviation but also contributes to our understanding of the neural underpinnings of these states. This study thus represents a significant advancement in the field of EEG-based mental state detection. MDPI 2023-11-08 /pmc/articles/PMC10674947/ /pubmed/38005440 http://dx.doi.org/10.3390/s23229052 Text en © 2023 by the authors. 
https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Alreshidi, Ibrahim Bisandu, Desmond Moulitsas, Irene Illuminating the Neural Landscape of Pilot Mental States: A Convolutional Neural Network Approach with Shapley Additive Explanations Interpretability |
title | Illuminating the Neural Landscape of Pilot Mental States: A Convolutional Neural Network Approach with Shapley Additive Explanations Interpretability |
title_full | Illuminating the Neural Landscape of Pilot Mental States: A Convolutional Neural Network Approach with Shapley Additive Explanations Interpretability |
title_fullStr | Illuminating the Neural Landscape of Pilot Mental States: A Convolutional Neural Network Approach with Shapley Additive Explanations Interpretability |
title_full_unstemmed | Illuminating the Neural Landscape of Pilot Mental States: A Convolutional Neural Network Approach with Shapley Additive Explanations Interpretability |
title_short | Illuminating the Neural Landscape of Pilot Mental States: A Convolutional Neural Network Approach with Shapley Additive Explanations Interpretability |
title_sort | illuminating the neural landscape of pilot mental states: a convolutional neural network approach with shapley additive explanations interpretability |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10674947/ https://www.ncbi.nlm.nih.gov/pubmed/38005440 http://dx.doi.org/10.3390/s23229052 |
work_keys_str_mv | AT alreshidiibrahim illuminatingtheneurallandscapeofpilotmentalstatesaconvolutionalneuralnetworkapproachwithshapleyadditiveexplanationsinterpretability AT bisandudesmond illuminatingtheneurallandscapeofpilotmentalstatesaconvolutionalneuralnetworkapproachwithshapleyadditiveexplanationsinterpretability AT moulitsasirene illuminatingtheneurallandscapeofpilotmentalstatesaconvolutionalneuralnetworkapproachwithshapleyadditiveexplanationsinterpretability |