
A novel motionless calibration method for augmented reality surgery navigation system based on optical tracker

Augmented reality (AR) surgery navigation systems display the pre-operatively planned virtual model at the correct position in the real surgical scene to assist the operation. Accurate calibration of the mapping between the virtual coordinate system and the real world is the key to the virtual-real fusion effect.


Bibliographic Details
Main Authors: Wan, Xinjun, Shen, Lizhengyi, Fang, Zhiqiang, Dong, Shao, Zhang, Shilei, Lin, Chengzhong
Format: Online Article Text
Language: English
Published: Elsevier 2022
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9801086/
https://www.ncbi.nlm.nih.gov/pubmed/36590529
http://dx.doi.org/10.1016/j.heliyon.2022.e12115
author Wan, Xinjun
Shen, Lizhengyi
Fang, Zhiqiang
Dong, Shao
Zhang, Shilei
Lin, Chengzhong
author_sort Wan, Xinjun
collection PubMed
description Augmented reality (AR) surgery navigation systems display the pre-operatively planned virtual model at the correct position in the real surgical scene to assist the operation. Accurate calibration of the mapping between the virtual coordinate system and the real world is the key to the virtual-real fusion effect. Previous calibration methods require the doctor to carry out complex manual procedures before use. This paper introduces a novel motionless virtual-real calibration method. The method only requires taking a mixed reality image, containing both virtual and real marker balls, with the built-in forward camera of the AR glasses. The mapping between the virtual and real spaces is calculated by using the camera coordinate system as a transformation medium. The composition and working process of the AR navigation system are introduced, and the mathematical principle of the calibration is then derived. The feasibility of the proposed calibration scheme is verified in a validation experiment; the average registration accuracy is around 5.80 mm, which is on the same level as previously reported methods. The proposed method is convenient and rapid to implement, and the calibration accuracy does not depend on user experience. Furthermore, it can potentially enable real-time updating of the registration transformation matrix, which can improve AR fusion accuracy when the AR glasses move. This motionless calibration method has great potential for application in future clinical navigation research.
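The core computation described in the abstract is chaining two rigid transforms through the camera frame: real (optical-tracker) space to camera space, then camera space to the virtual space of the AR glasses. The sketch below is a minimal illustration of that idea and is not the authors' implementation; the Kabsch/SVD point registration, all function and variable names, and the placeholder marker-ball coordinates are assumptions introduced for clarity.

```python
# Minimal sketch: use the camera coordinate system as a transformation medium
# between real (optical-tracker) space and the virtual space of the AR glasses.
# All inputs are placeholders; names are illustrative, not from the paper.

import numpy as np


def rigid_transform(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Return the 4x4 rigid transform that best maps src points onto dst points.

    src, dst: (N, 3) arrays of corresponding 3-D points, e.g. marker-ball
    centres expressed in two coordinate systems (Kabsch/SVD method).
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)              # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = dst_c - R @ src_c
    return T


# Placeholder marker-ball centres (mm). In practice they would come from the
# optical tracker (real balls), from localising the balls in the mixed reality
# image of the forward camera, and from the glasses' rendering pipeline.
real_in_tracker   = np.array([[0, 0, 0], [60, 0, 0], [0, 60, 0], [0, 0, 60]], float)
real_in_camera    = np.array([[10, 5, 300], [70, 5, 300], [10, 65, 300], [10, 5, 360]], float)
virtual_in_camera = np.array([[12, 7, 310], [72, 7, 310], [12, 67, 310], [12, 7, 370]], float)
virtual_in_glasses = np.array([[1, 1, 1], [61, 1, 1], [1, 61, 1], [1, 1, 61]], float)

T_cam_from_real = rigid_transform(real_in_tracker, real_in_camera)        # real -> camera
T_virt_from_cam = rigid_transform(virtual_in_camera, virtual_in_glasses)  # camera -> virtual

# Camera frame as the medium: map optical-tracker coordinates into the virtual
# space of the glasses so the planned model can be overlaid on the patient.
T_virt_from_real = T_virt_from_cam @ T_cam_from_real
print(T_virt_from_real)
```

With a mapping of this form, any point tracked in the real space (for example a planned entry point registered to the patient) can be expressed in the virtual space of the glasses, which is the registration the abstract evaluates at around 5.80 mm.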
format Online
Article
Text
id pubmed-9801086
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Elsevier
record_format MEDLINE/PubMed
spelling pubmed-9801086 2022-12-31 A novel motionless calibration method for augmented reality surgery navigation system based on optical tracker. Wan, Xinjun; Shen, Lizhengyi; Fang, Zhiqiang; Dong, Shao; Zhang, Shilei; Lin, Chengzhong. Heliyon, Research Article. Elsevier, 2022-12-09. /pmc/articles/PMC9801086/ /pubmed/36590529 http://dx.doi.org/10.1016/j.heliyon.2022.e12115 Text en © 2022 The Author(s). This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
title A novel motionless calibration method for augmented reality surgery navigation system based on optical tracker
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9801086/
https://www.ncbi.nlm.nih.gov/pubmed/36590529
http://dx.doi.org/10.1016/j.heliyon.2022.e12115