Augmented Reality monitoring of robot-assisted intervention in harsh environments at CERN
Main authors:
Language: eng
Published: 2018
Subjects:
Online access: https://dx.doi.org/10.1088/1742-6596/1065/17/172008 , http://cds.cern.ch/record/2681179
Summary: An architecture for a human-robot navigation system, based on ultra-wideband positioning, a pair of ultrasonic sensors for heading, and an augmented reality smart-glasses interface, is presented. The position is obtained by a trilateration algorithm based on an Extended Kalman Filter, and the heading by fusing the ultra-wideband position with the phase difference measured by the ultrasonic system. The phase received at each ultrasonic sensor is extracted using the three-parameter sine fitting algorithm. For this application in the CERN Large Hadron Collider tunnel, the inspection robot precedes the human during navigation in the harsh environment and collects temperature, oxygen percentage, and radiation level. The environmental measurements are displayed to the operator on the smart-glasses, and in case of a dangerous condition the operator is warned through the augmented reality interface. The navigation and monitoring system allows the relative human-robot position to be maintained safely. Preliminary simulation results of the positioning and heading system are discussed to validate the main idea.
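The summary mentions that the phase of the received ultrasonic signal is extracted with a three-parameter sine fitting algorithm. The sketch below is a minimal illustration of that technique (in the style of IEEE Std 1057), not the authors' implementation: it assumes the ultrasonic carrier frequency is known and fits amplitude, phase, and DC offset by linear least squares. All names, sampling values, and the 40 kHz carrier are illustrative assumptions.

```python
# Minimal sketch of three-parameter sine fitting at a known frequency.
# Assumption: the received ultrasonic signal is modelled as
#   y(t) = M*sin(w*t + phi) + C, with w known.
import numpy as np

def three_param_sine_fit(t, y, freq_hz):
    """Least-squares fit of y(t) = A*cos(w t) + B*sin(w t) + C at a known
    frequency; returns amplitude, phase (rad), and DC offset."""
    w = 2.0 * np.pi * freq_hz
    # Linear-in-parameters design matrix: [cos(wt), sin(wt), 1].
    D = np.column_stack((np.cos(w * t), np.sin(w * t), np.ones_like(t)))
    (A, B, C), *_ = np.linalg.lstsq(D, y, rcond=None)
    amplitude = np.hypot(A, B)
    phase = np.arctan2(A, B)  # since M*sin(wt+phi) = M*sin(phi)cos(wt) + M*cos(phi)sin(wt)
    return amplitude, phase, C

if __name__ == "__main__":
    # Synthetic 40 kHz burst sampled at 1 MHz (illustrative values only).
    fs, f0 = 1.0e6, 40.0e3
    t = np.arange(0, 1e-3, 1.0 / fs)
    y = 0.8 * np.sin(2 * np.pi * f0 * t + 0.6) + 0.1 + 0.02 * np.random.randn(t.size)
    amp, phi, dc = three_param_sine_fit(t, y, f0)
    print(f"amplitude={amp:.3f}, phase={phi:.3f} rad, offset={dc:.3f}")
```

In the scheme described by the abstract, the phase difference between the two ultrasonic receivers (here, the difference of two such fitted phases) would be fused with the ultra-wideband position estimate to recover heading; that fusion step is not reproduced here.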