TechRxiv
MixedRealityHumanTeleoperation.pdf (494.98 kB)

Human Teleoperation - A Haptically Enabled Mixed Reality System for Teleultrasound

preprint
posted on 19.08.2021, 10:51 by David BlackDavid Black, Yas Oloumi Yazdi, Amir Hossein Hadi Hosseinabadi, Septimiu Salcudean

Current teleguidance methods include verbal guidance and robotic teleoperation, which present tradeoffs between precision and latency on the one hand and flexibility and cost on the other. We present a novel concept of "human teleoperation" that bridges the gap between these two methods. A prototype teleultrasound system was implemented to demonstrate the concept's efficacy. An expert remotely "teleoperates" a person (the follower) wearing a mixed reality headset by controlling a virtual ultrasound probe projected into the person's scene. The follower matches the pose and force of the virtual device with a real probe. The pose, force, video, ultrasound images, and 3-dimensional mesh of the scene are fed back to the expert. In this control framework, the input and the actuation are carried out by people, but with near robot-like latency and precision. This enables teleguidance that is faster and more precise than verbal guidance, yet more flexible and less expensive than robotic teleoperation. The system was subjected to tests that show its effectiveness, including a mean teleoperation latency of 0.27 s and mean pose-tracking errors of 7 mm in position and 6° in orientation. The system was also tested with an expert ultrasonographer and four patients and was found to improve the precision and speed of two teleultrasound procedures.
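The abstract reports pose-tracking errors of 7 mm and 6° between the virtual probe commanded by the expert and the real probe held by the follower. The abstract does not specify how these errors are computed; a minimal sketch, assuming position error is the Euclidean distance between probe positions and orientation error is the angle of the relative rotation between unit quaternions (function names and conventions are illustrative, not from the paper):

```python
import numpy as np

def pose_error(p_target, q_target, p_actual, q_actual):
    """Return (position error, orientation error in degrees) between
    a target pose and an actual pose.

    Positions are 3-vectors (any consistent unit); orientations are
    unit quaternions in [w, x, y, z] order.
    """
    # Position error: straight-line distance between the two probe tips
    pos_err = np.linalg.norm(np.asarray(p_actual) - np.asarray(p_target))
    # Orientation error: angle of the relative rotation,
    # theta = 2 * arccos(|<q_target, q_actual>|), sign-invariant
    dot = abs(np.dot(q_target, q_actual))
    ang_err = np.degrees(2.0 * np.arccos(np.clip(dot, 0.0, 1.0)))
    return pos_err, ang_err

# Example: follower is 7 mm off in position and 6 degrees off about z
half = np.radians(6.0) / 2.0
q_t = np.array([1.0, 0.0, 0.0, 0.0])
q_a = np.array([np.cos(half), 0.0, 0.0, np.sin(half)])
pos, ang = pose_error([0.0, 0.0, 0.0], q_t, [0.007, 0.0, 0.0], q_a)
print(f"{pos * 1000:.0f} mm, {ang:.0f} deg")  # 7 mm, 6 deg
```

Such per-frame errors would typically be averaged over a tracking session to obtain the summary figures quoted above.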

History

Email Address of Submitting Author

dgblack@ece.ubc.ca

ORCID of Submitting Author

0000-0001-6907-9851

Submitting Author's Institution

University of British Columbia

Submitting Author's Country

Canada