(Mis-)use of standard Autopilot and Full Self-Driving (FSD) Beta: Results from interviews with users of Tesla's FSD Beta
Tesla's Full Self-Driving Beta (FSD) program introduces technology that extends the operational design domain of standard Autopilot from highways to urban roads. This research conducted 103 in-depth semi-structured interviews with users of Tesla's FSD Beta and standard Autopilot to evaluate the impact on user behavior and perception. It was found that drivers became complacent over time with Autopilot engaged, failing to monitor the system and engaging in safety-critical behaviors such as hands-free driving (enabled by weights placed on the steering wheel), mind wandering, or sleeping behind the wheel. Drivers' movements of eyes, hands, and feet became more relaxed with experience with Autopilot engaged. As unfinished technology, FSD Beta required constant supervision, which increased driver stress and mental and physical workload because drivers had to be constantly prepared for unsafe system behavior (doing the wrong thing at the worst time). The hands-on-wheel check was not considered necessarily effective for monitoring drivers and guaranteeing safe use. Drivers adapt to automation over time, engaging in potentially dangerous behaviors. Some behavior appears to be a knowing violation of intended use (e.g., weighting the steering wheel), while other behavior reflects a misunderstanding or lack of experience (e.g., using Autopilot on roads it was not designed for). As unfinished Beta technology, FSD Beta can introduce new forms of stress and can be inherently unsafe. We recommend that future research investigate to what extent these behavioral changes affect accident risk and whether they can be alleviated through driver state monitoring and assistance.
Main Authors: | Nordhoff, Sina; Lee, John D.; Calvert, Simeon C.; Berge, Siri; Hagenzieker, Marjan; Happee, Riender |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2023 |
Subjects: | Psychology |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9996345/ https://www.ncbi.nlm.nih.gov/pubmed/36910772 http://dx.doi.org/10.3389/fpsyg.2023.1101520 |
_version_ | 1784903024610639872 |
author | Nordhoff, Sina Lee, John D. Calvert, Simeon C. Berge, Siri Hagenzieker, Marjan Happee, Riender |
author_facet | Nordhoff, Sina Lee, John D. Calvert, Simeon C. Berge, Siri Hagenzieker, Marjan Happee, Riender |
author_sort | Nordhoff, Sina |
collection | PubMed |
description | Tesla's Full Self-Driving Beta (FSD) program introduces technology that extends the operational design domain of standard Autopilot from highways to urban roads. This research conducted 103 in-depth semi-structured interviews with users of Tesla's FSD Beta and standard Autopilot to evaluate the impact on user behavior and perception. It was found that drivers became complacent over time with Autopilot engaged, failing to monitor the system and engaging in safety-critical behaviors such as hands-free driving (enabled by weights placed on the steering wheel), mind wandering, or sleeping behind the wheel. Drivers' movements of eyes, hands, and feet became more relaxed with experience with Autopilot engaged. As unfinished technology, FSD Beta required constant supervision, which increased driver stress and mental and physical workload because drivers had to be constantly prepared for unsafe system behavior (doing the wrong thing at the worst time). The hands-on-wheel check was not considered necessarily effective for monitoring drivers and guaranteeing safe use. Drivers adapt to automation over time, engaging in potentially dangerous behaviors. Some behavior appears to be a knowing violation of intended use (e.g., weighting the steering wheel), while other behavior reflects a misunderstanding or lack of experience (e.g., using Autopilot on roads it was not designed for). As unfinished Beta technology, FSD Beta can introduce new forms of stress and can be inherently unsafe. We recommend that future research investigate to what extent these behavioral changes affect accident risk and whether they can be alleviated through driver state monitoring and assistance. |
format | Online Article Text |
id | pubmed-9996345 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-99963452023-03-10 (Mis-)use of standard Autopilot and Full Self-Driving (FSD) Beta: Results from interviews with users of Tesla's FSD Beta Nordhoff, Sina Lee, John D. Calvert, Simeon C. Berge, Siri Hagenzieker, Marjan Happee, Riender Front Psychol Psychology Tesla's Full Self-Driving Beta (FSD) program introduces technology that extends the operational design domain of standard Autopilot from highways to urban roads. This research conducted 103 in-depth semi-structured interviews with users of Tesla's FSD Beta and standard Autopilot to evaluate the impact on user behavior and perception. It was found that drivers became complacent over time with Autopilot engaged, failing to monitor the system, and engaging in safety-critical behaviors, such as hands-free driving, enabled by weights placed on the steering wheel, mind wandering, or sleeping behind the wheel. Drivers' movement of eyes, hands, and feet became more relaxed with experience with Autopilot engaged. FSD Beta required constant supervision as unfinished technology, which increased driver stress and mental and physical workload as drivers had to be constantly prepared for unsafe system behavior (doing the wrong thing at the worst time). The hands-on wheel check was not considered as being necessarily effective in driver monitoring and guaranteeing safe use. Drivers adapt to automation over time, engaging in potentially dangerous behaviors. Some behavior seems to be a knowing violation of intended use (e.g., weighting the steering wheel), and other behavior reflects a misunderstanding or lack of experience (e.g., using Autopilot on roads not designed for). As unfinished Beta technology, FSD Beta can introduce new forms of stress and can be inherently unsafe. We recommend future research to investigate to what extent these behavioral changes affect accident risk and can be alleviated through driver state monitoring and assistance. Frontiers Media S.A. 
2023-02-23 /pmc/articles/PMC9996345/ /pubmed/36910772 http://dx.doi.org/10.3389/fpsyg.2023.1101520 Text en Copyright © 2023 Nordhoff, Lee, Calvert, Berge, Hagenzieker and Happee. https://creativecommons.org/licenses/by/4.0/This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Psychology Nordhoff, Sina Lee, John D. Calvert, Simeon C. Berge, Siri Hagenzieker, Marjan Happee, Riender (Mis-)use of standard Autopilot and Full Self-Driving (FSD) Beta: Results from interviews with users of Tesla's FSD Beta |
title | (Mis-)use of standard Autopilot and Full Self-Driving (FSD) Beta: Results from interviews with users of Tesla's FSD Beta |
title_full | (Mis-)use of standard Autopilot and Full Self-Driving (FSD) Beta: Results from interviews with users of Tesla's FSD Beta |
title_fullStr | (Mis-)use of standard Autopilot and Full Self-Driving (FSD) Beta: Results from interviews with users of Tesla's FSD Beta |
title_full_unstemmed | (Mis-)use of standard Autopilot and Full Self-Driving (FSD) Beta: Results from interviews with users of Tesla's FSD Beta |
title_short | (Mis-)use of standard Autopilot and Full Self-Driving (FSD) Beta: Results from interviews with users of Tesla's FSD Beta |
title_sort | (mis-)use of standard autopilot and full self-driving (fsd) beta: results from interviews with users of tesla's fsd beta |
topic | Psychology |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9996345/ https://www.ncbi.nlm.nih.gov/pubmed/36910772 http://dx.doi.org/10.3389/fpsyg.2023.1101520 |
work_keys_str_mv | AT nordhoffsina misuseofstandardautopilotandfullselfdrivingfsdbetaresultsfrominterviewswithusersofteslasfsdbeta AT leejohnd misuseofstandardautopilotandfullselfdrivingfsdbetaresultsfrominterviewswithusersofteslasfsdbeta AT calvertsimeonc misuseofstandardautopilotandfullselfdrivingfsdbetaresultsfrominterviewswithusersofteslasfsdbeta AT bergesiri misuseofstandardautopilotandfullselfdrivingfsdbetaresultsfrominterviewswithusersofteslasfsdbeta AT hagenziekermarjan misuseofstandardautopilotandfullselfdrivingfsdbetaresultsfrominterviewswithusersofteslasfsdbeta AT happeeriender misuseofstandardautopilotandfullselfdrivingfsdbetaresultsfrominterviewswithusersofteslasfsdbeta |