
Visual attention prediction improves performance of autonomous drone racing agents


Bibliographic Details
Main Authors: Pfeiffer, Christian; Wengeler, Simon; Loquercio, Antonio; Scaramuzza, Davide
Format: Online Article Text
Language: English
Published: Public Library of Science 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8887736/
https://www.ncbi.nlm.nih.gov/pubmed/35231038
http://dx.doi.org/10.1371/journal.pone.0264471
author Pfeiffer, Christian
Wengeler, Simon
Loquercio, Antonio
Scaramuzza, Davide
author_facet Pfeiffer, Christian
Wengeler, Simon
Loquercio, Antonio
Scaramuzza, Davide
author_sort Pfeiffer, Christian
collection PubMed
description Humans race drones faster than neural networks trained for end-to-end autonomous flight. This may be related to the ability of human pilots to select task-relevant visual information effectively. This work investigates whether neural networks capable of imitating human eye gaze behavior and attention can improve neural networks’ performance for the challenging task of vision-based autonomous drone racing. We hypothesize that gaze-based attention prediction can be an efficient mechanism for visual information selection and decision making in a simulator-based drone racing task. We test this hypothesis using eye gaze and flight trajectory data from 18 human drone pilots to train a visual attention prediction model. We then use this visual attention prediction model to train an end-to-end controller for vision-based autonomous drone racing using imitation learning. We compare the drone racing performance of the attention-prediction controller to those using raw image inputs and image-based abstractions (i.e., feature tracks). Comparing success rates for completing a challenging race track by autonomous flight, our results show that the attention-prediction based controller (88% success rate) outperforms the RGB-image (61% success rate) and feature-tracks (55% success rate) controller baselines. Furthermore, visual attention-prediction and feature-track based models showed better generalization performance than image-based models when evaluated on hold-out reference trajectories. Our results demonstrate that human visual attention prediction improves the performance of autonomous vision-based drone racing agents and provides an essential step towards vision-based, fast, and agile autonomous flight that eventually can reach and even exceed human performances.
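The abstract describes training a visual attention prediction model from human pilots' eye-gaze data. A common way to turn raw gaze fixations into a supervised training target is to render them as a normalized Gaussian heatmap over the image; the sketch below is a minimal illustration of that preprocessing step, not the authors' actual pipeline (the function name, parameters, and sigma choice are assumptions):

```python
import numpy as np

def gaze_to_attention_map(fixations, height, width, sigma=10.0):
    """Render gaze fixation points (x, y) in pixel coordinates into a
    Gaussian attention heatmap, normalized to sum to 1 so it can serve
    as a probability-map training target. Illustrative only; the paper's
    exact preprocessing may differ."""
    ys, xs = np.mgrid[0:height, 0:width]
    heat = np.zeros((height, width), dtype=np.float64)
    for fx, fy in fixations:
        # Add one isotropic Gaussian centered on each fixation point.
        heat += np.exp(-((xs - fx) ** 2 + (ys - fy) ** 2) / (2 * sigma ** 2))
    total = heat.sum()
    # Normalize to a distribution; leave the map empty if no fixations fell on-screen.
    return heat / total if total > 0 else heat

# Example: a single fixation at pixel (x=32, y=16) on a 64x48 frame.
attention = gaze_to_attention_map([(32, 16)], height=48, width=64, sigma=4.0)
```

A model trained to regress such maps from camera frames can then supply the attention input that the imitation-learned racing controller consumes alongside (or instead of) raw RGB images.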
format Online
Article
Text
id pubmed-8887736
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-8887736 2022-03-02 Visual attention prediction improves performance of autonomous drone racing agents Pfeiffer, Christian; Wengeler, Simon; Loquercio, Antonio; Scaramuzza, Davide PLoS One Research Article Public Library of Science 2022-03-01 /pmc/articles/PMC8887736/ /pubmed/35231038 http://dx.doi.org/10.1371/journal.pone.0264471 Text en © 2022 Pfeiffer et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
spellingShingle Research Article
Pfeiffer, Christian
Wengeler, Simon
Loquercio, Antonio
Scaramuzza, Davide
Visual attention prediction improves performance of autonomous drone racing agents
title Visual attention prediction improves performance of autonomous drone racing agents
title_full Visual attention prediction improves performance of autonomous drone racing agents
title_fullStr Visual attention prediction improves performance of autonomous drone racing agents
title_full_unstemmed Visual attention prediction improves performance of autonomous drone racing agents
title_short Visual attention prediction improves performance of autonomous drone racing agents
title_sort visual attention prediction improves performance of autonomous drone racing agents
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8887736/
https://www.ncbi.nlm.nih.gov/pubmed/35231038
http://dx.doi.org/10.1371/journal.pone.0264471
work_keys_str_mv AT pfeifferchristian visualattentionpredictionimprovesperformanceofautonomousdroneracingagents
AT wengelersimon visualattentionpredictionimprovesperformanceofautonomousdroneracingagents
AT loquercioantonio visualattentionpredictionimprovesperformanceofautonomousdroneracingagents
AT scaramuzzadavide visualattentionpredictionimprovesperformanceofautonomousdroneracingagents