O(2)A: One-Shot Observational Learning with Action Vectors
We present O(2)A, a novel method for learning to perform robotic manipulation tasks from a single (one-shot) third-person demonstration video. To our knowledge, this is the first time this has been achieved from a single demonstration. The key novelty lies in pre-training a feature extractor for creating a...
Main authors: Pauly, Leo; Agboh, Wisdom C.; Hogg, David C.; Fuentes, Raul
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2021
Online access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8367442/
https://www.ncbi.nlm.nih.gov/pubmed/34409071
http://dx.doi.org/10.3389/frobt.2021.686368
Similar items
- One-shot learning for autonomous aerial manipulation
  by: Zito, Claudio, et al.
  Published: (2022)
- AROS: Affordance Recognition with One-Shot Human Stances
  by: Pacheco-Ortega, Abel, et al.
  Published: (2023)
- Zero-shot model-free learning of periodic movements for a bio-inspired soft-robotic arm
  by: Oikonomou, Paris, et al.
  Published: (2023)
- Few-Shot Induction of Generalized Logical Concepts via Human Guidance
  by: Das, Mayukh, et al.
  Published: (2020)
- Integrated Cognitive Architecture for Robot Learning of Action and Language
  by: Miyazawa, Kazuki, et al.
  Published: (2019)