Learn on the Fly
In this study, we explore the biologically-inspired Learn-On-The-Fly (LOTF) method that actively learns and discovers patterns with improvisation and sensory intelligence, including pheromone trails, structure from motion, sensory fusion, sensory inhibition, and spontaneous alternation. LOTF is related to classic online modeling and adaptive modeling methods. However, it aims to solve more comprehensive, ill-structured problems such as human activity recognition from a drone video in a disastrous environment. It helps to build explainable AI models that enable human-machine teaming with visual representation, visual reasoning, and machine vision. It is anticipated that LOTF would have an impact on Artificial Intelligence, video analytics for searching and tracking survivors' activities for humanitarian assistance and disaster relief (HADR), field augmented reality, and field robotic swarms.
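As a purely illustrative aid, the sketch below shows a generic pheromone-trail update (evaporation plus deposit on a grid), one of the ant-inspired mechanisms named in the abstract. It is not the paper's LOTF implementation; the grid size, deposit amount, and evaporation rate are arbitrary assumptions chosen for the example.

```python
# Minimal, generic pheromone-trail sketch (deposit + evaporation on a grid).
# Hypothetical parameters; not the LOTF method from the paper.
import numpy as np

GRID = (50, 50)      # assumed map size
DEPOSIT = 1.0        # pheromone added where an agent has been
EVAPORATION = 0.05   # fraction of pheromone lost per time step

def update_trail(trail: np.ndarray, visits: list[tuple[int, int]]) -> np.ndarray:
    """Evaporate the existing trail, then deposit pheromone at visited cells."""
    trail = (1.0 - EVAPORATION) * trail
    for r, c in visits:
        trail[r, c] += DEPOSIT
    return trail

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    trail = np.zeros(GRID)
    # Simulate a few steps of agents wandering and reinforcing a shared trail.
    for _ in range(100):
        visits = [tuple(rng.integers(0, 50, size=2)) for _ in range(5)]
        trail = update_trail(trail, visits)
    # Cells with high pheromone mark frequently revisited locations.
    print("strongest trail cell:", np.unravel_index(trail.argmax(), trail.shape))
```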
Main author: | Cai, Yang |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | 2020 |
Subjects: | Article |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7325289/ http://dx.doi.org/10.1007/978-3-030-51758-8_8 |
_version_ | 1783552122075217920 |
---|---|
author | Cai, Yang |
author_facet | Cai, Yang |
author_sort | Cai, Yang |
collection | PubMed |
description | In this study, we explore the biologically-inspired Learn-On-The-Fly (LOTF) method that actively learns and discovers patterns with improvisation and sensory intelligence, including pheromone trails, structure from motion, sensory fusion, sensory inhibition, and spontaneous alternation. LOTF is related to classic online modeling and adaptive modeling methods. However, it aims to solve more comprehensive, ill-structured problems such as human activity recognition from a drone video in a disastrous environment. It helps to build explainable AI models that enable human-machine teaming with visual representation, visual reasoning, and machine vision. It is anticipated that LOTF would have an impact on Artificial Intelligence, video analytics for searching and tracking survivors’ activities for humanitarian assistance and disaster relief (HADR), field augmented reality, and field robotic swarms. |
format | Online Article Text |
id | pubmed-7325289 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
record_format | MEDLINE/PubMed |
spelling | pubmed-73252892020-06-30 Learn on the Fly Cai, Yang Advances in Human Factors in Robots, Drones and Unmanned Systems Article In this study, we explore the biologically-inspired Learn-On-The-Fly (LOTF) method that actively learns and discovers patterns with improvisation and sensory intelligence, including pheromone trails, structure from motion, sensory fusion, sensory inhibition, and spontaneous alternation. LOTF is related to classic online modeling and adaptive modeling methods. However, it aims to solve more comprehensive, ill-structured problems such as human activity recognition from a drone video in a disastrous environment. It helps to build explainable AI models that enable human-machine teaming with visual representation, visual reasoning, and machine vision. It is anticipated that LOTF would have an impact on Artificial Intelligence, video analytics for searching and tracking survivors’ activities for humanitarian assistance and disaster relief (HADR), field augmented reality, and field robotic swarms. 2020-05-31 /pmc/articles/PMC7325289/ http://dx.doi.org/10.1007/978-3-030-51758-8_8 Text en © The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2021 This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic. |
spellingShingle | Article Cai, Yang Learn on the Fly |
title | Learn on the Fly |
title_full | Learn on the Fly |
title_fullStr | Learn on the Fly |
title_full_unstemmed | Learn on the Fly |
title_short | Learn on the Fly |
title_sort | learn on the fly |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7325289/ http://dx.doi.org/10.1007/978-3-030-51758-8_8 |
work_keys_str_mv | AT caiyang learnonthefly |