
Smart Task Assistance in Mixed Reality for Astronauts

Mixed reality (MR) registers virtual information and real objects and is an effective way to supplement astronaut training. Spatial anchors are generally used to perform virtual–real fusion in static scenes but cannot handle movable objects. To address this issue, we propose a smart task assistance method based on object detection and point cloud alignment. Specifically, both fixed and movable objects are detected automatically. In parallel, poses are estimated with no dependence on preset spatial position information. Firstly, YOLOv5s is used to detect the object and segment the point cloud of the corresponding structure, called the partial point cloud. Then, an iterative closest point (ICP) algorithm between the partial point cloud and the template point cloud is used to calculate the object's pose and execute the virtual–real fusion. The results demonstrate that the proposed method achieves automatic pose estimation for both fixed and movable objects without background information and preset spatial anchors. Most volunteers reported that our approach was practical, and it thus expands the application of astronaut training.
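The abstract describes a two-stage pipeline: YOLOv5s detects the object in the image, the depth pixels inside the detected box are segmented into a "partial" point cloud, and ICP aligns a template point cloud of the object to it to recover the pose used for virtual–real fusion. The following is a minimal sketch of that idea, assuming an RGB-D input, the public YOLOv5 hub model, and Open3D for ICP; the function names, intrinsics format, and the 2 cm correspondence threshold are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the detect-then-align pipeline from the abstract.
import numpy as np
import open3d as o3d
import torch

detector = torch.hub.load("ultralytics/yolov5", "yolov5s")  # pretrained YOLOv5s

def partial_cloud_from_box(depth, intrinsics, box):
    """Back-project the depth pixels inside a 2D bounding box into a 3D point cloud."""
    fx, fy, cx, cy = intrinsics
    x1, y1, x2, y2 = [int(v) for v in box]
    crop = depth[y1:y2, x1:x2]
    vs, us = np.nonzero(crop > 0)          # valid depth pixels inside the box
    z = crop[vs, us]
    x = (us + x1 - cx) * z / fx
    y = (vs + y1 - cy) * z / fy
    cloud = o3d.geometry.PointCloud()
    cloud.points = o3d.utility.Vector3dVector(np.stack([x, y, z], axis=1))
    return cloud

def estimate_pose(rgb, depth, intrinsics, template_cloud):
    """Detect the object, segment its partial cloud, and align the template by ICP."""
    det = detector(rgb).xyxy[0]            # rows: x1, y1, x2, y2, confidence, class
    if len(det) == 0:
        return None
    box = det[0, :4].tolist()              # take the first (assumed best) detection
    partial = partial_cloud_from_box(depth, intrinsics, box)
    icp = o3d.pipelines.registration.registration_icp(
        template_cloud, partial,
        max_correspondence_distance=0.02,  # 2 cm threshold (assumed)
        init=np.eye(4),
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return icp.transformation              # 4x4 template-to-camera pose for virtual-real fusion
```

In practice the paper's system would run on a headset's depth sensor and likely refine the initial ICP guess (for example from a coarse global registration) rather than starting from the identity, but the core detect–segment–align flow is as above.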


Bibliographic Details

Main Authors: Sun, Qingwei; Chen, Wei; Chao, Jiangang; Lin, Wanhong; Xu, Zhenying; Cao, Ruizhi
Format: Online Article (Text)
Language: English
Journal: Sensors (Basel)
Published: MDPI, 27 April 2023
Collection: PubMed (National Center for Biotechnology Information)
Subjects: Article
License: © 2023 by the authors. Open access under the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10181572/
https://www.ncbi.nlm.nih.gov/pubmed/37177546
http://dx.doi.org/10.3390/s23094344