
Human Teleoperation - A Haptically Enabled Mixed Reality System for Teleultrasound
  • David Black
  • Yas Oloumi Yazdi
  • Amir Hossein Hadi Hosseinabadi
  • Septimiu Salcudean
University of British Columbia

Corresponding Author: [email protected]


Current teleguidance methods include verbal guidance and robotic teleoperation, which present trade-offs between precision and latency on one hand and flexibility and cost on the other. We present a novel concept of “human teleoperation” that bridges the gap between these two methods, and a prototype teleultrasound system that demonstrates the concept’s efficacy. An expert remotely “teleoperates” a person (the follower) wearing a mixed reality headset by controlling a virtual ultrasound probe projected into the follower’s scene. The follower matches the pose and force of the virtual device with a real probe. The pose, force, video, ultrasound images, and 3-dimensional mesh of the scene are fed back to the expert. In this control framework, the input and the actuation are carried out by people, but with near robot-like latency and precision. This enables teleguidance that is faster and more precise than verbal guidance, yet more flexible and inexpensive than robotic teleoperation. In tests, the system achieved a mean teleoperation latency of 0.27 seconds and pose-tracking errors of 7 mm in position and 6° in orientation. The system was also evaluated with an expert ultrasonographer and four patients and was found to improve the precision and speed of two teleultrasound procedures.
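The reported pose-tracking errors (7 mm, 6°) compare the pose of the expert's virtual probe against the pose of the real probe held by the follower. The paper does not give its exact error formulation; the sketch below shows one common way to compute such metrics, with position error as the Euclidean distance between probe positions and orientation error as the angle of the relative rotation between two unit quaternions. The function name and quaternion convention (w, x, y, z) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def pose_tracking_error(p_virtual, p_real, q_virtual, q_real):
    """Illustrative pose-error metrics (assumed formulation, not the
    paper's). Position error: Euclidean distance in metres between the
    virtual (commanded) and real (follower) probe positions.
    Orientation error: angle in degrees of the relative rotation
    between the two unit quaternions."""
    pos_err = np.linalg.norm(np.asarray(p_real) - np.asarray(p_virtual))
    # For unit quaternions, |<q1, q2>| = cos(theta/2), where theta is
    # the relative rotation angle; the abs() handles double cover.
    dot = abs(np.dot(q_virtual, q_real))
    ang_err = 2.0 * np.degrees(np.arccos(np.clip(dot, 0.0, 1.0)))
    return pos_err, ang_err

# Example: a 7 mm offset along x and a 6-degree rotation about z,
# matching the error magnitudes reported in the abstract.
p_v, p_r = [0.0, 0.0, 0.0], [0.007, 0.0, 0.0]
theta = np.radians(6.0)
q_v = np.array([1.0, 0.0, 0.0, 0.0])  # identity rotation (w, x, y, z)
q_r = np.array([np.cos(theta / 2), 0.0, 0.0, np.sin(theta / 2)])
pos_err, ang_err = pose_tracking_error(p_v, p_r, q_v, q_r)
```

With these inputs, `pos_err` is 0.007 m and `ang_err` is 6.0 degrees, the magnitudes quoted in the abstract.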
Published 30 Jun 2023 in Human–Computer Interaction, pages 1–24. DOI: 10.1080/07370024.2023.2218355