David Black and 2 more

Purpose: In “human teleoperation” [1], augmented reality (AR) and haptics are used to tightly couple an expert leader to a human follower. To determine the feasibility of human teleoperation, we quantify the ability of humans to track a position and/or force trajectory via AR cues. The human response time, precision, frequency response, and step response were characterized, and several rendering methods were compared.

Methods: Volunteers (n=11) performed a series of tasks as the follower in our human teleoperation system. The tasks involved tracking pre-recorded series of motions and forces, each time with a different rendering method. The order of tasks and rendering methods was randomized to avoid learning effects and bias. The volunteers then performed a series of frequency response tests and filled out a questionnaire.

Results: Rendering the full ultrasound probe as a position target, with an error bar displaying force, led to the best position and force tracking. Following force and pose simultaneously was more difficult but did not lead to significant performance degradation versus following one at a time. On average, subjects tracked positions, orientations, and forces with RMS tracking errors of 6.2 ± 1.9 mm, 5.9 ± 1.9°, and 1.0 ± 0.3 N, steady-state errors of 2.8 ± 2.1 mm and 0.26 ± 0.2 N, and lags of 345.5 ± 87.6 ms, respectively. Tracking performance decreased with increasing input frequency until, depending on the input amplitude, the subject could no longer follow.

Conclusion: This paper characterizes human tracking ability in augmented reality human teleoperation, demonstrating the feasibility and good performance of the system. These results are also important for designing future human-computer interfaces using augmented and virtual reality.
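
The RMS tracking error and lag reported above are standard comparisons between a target and a followed trace. As a minimal, hypothetical sketch (not the authors' analysis code), the Python snippet below shows how such metrics could be computed from synchronized leader and follower position traces, assuming both are sampled on a common time grid; the signal parameters are made up for illustration.

```python
import numpy as np

def rms_error(leader: np.ndarray, follower: np.ndarray) -> float:
    """Root-mean-square tracking error between two equal-length traces."""
    return float(np.sqrt(np.mean((follower - leader) ** 2)))

def tracking_lag(leader: np.ndarray, follower: np.ndarray, dt: float) -> float:
    """Estimate follower lag (s) as the cross-correlation peak offset."""
    leader = leader - leader.mean()
    follower = follower - follower.mean()
    corr = np.correlate(follower, leader, mode="full")
    shift = np.argmax(corr) - (len(leader) - 1)   # positive = follower lags
    return shift * dt

# Hypothetical data: 0.5 Hz sinusoidal target sampled at 100 Hz,
# followed with a 350 ms delay and small measurement noise.
rng = np.random.default_rng(0)
dt = 0.01
t = np.arange(0.0, 20.0, dt)
leader = 30.0 * np.sin(2 * np.pi * 0.5 * t)                       # target position, mm
follower = 30.0 * np.sin(2 * np.pi * 0.5 * (t - 0.35)) + rng.normal(0.0, 1.0, t.size)

print(f"RMS error: {rms_error(leader, follower):.1f} mm")
print(f"Lag: {tracking_lag(leader, follower, dt) * 1000:.0f} ms")
```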

David Black and 7 more

Purpose: Respiratory motion during positron emission tomography (PET) scans can be a major detriment to image quality in oncological imaging, leading to loss of quantification accuracy and false negative findings. The impact of motion on lesion quantification and detectability can be assessed using anthropomorphic phantoms with realistic anatomy representation and motion modelling. In this work, we design and build such a phantom, with careful consideration of system requirements and detailed force analysis.

Methods: We start from a previously developed, anatomically accurate shell of a human torso and add elastic lungs with a highly controllable actuation mechanism that replicates the physics of breathing. The space outside the lungs is filled with a radioactive water solution. To maintain anatomical accuracy in the torso and realistic gamma-ray attenuation, all motion mechanisms and actuators are positioned outside the phantom compartment. The actuation mechanism can produce a wide range of custom respiratory waveforms with breathing rates up to 25 breaths per minute and tidal volumes up to 1200 mL.

Results: Several tests were performed to validate the performance of the phantom assembly, in which the phantom was filled with water and given respiratory waveforms to execute. All parts demonstrated nominal performance. Force requirements were not exceeded and no leaks were detected, although continued use of the phantom is required to evaluate wear. The respiratory motion was determined to be within a realistic range.

Conclusions: This paper describes the full mechanical design, as well as a software application with a graphical user interface that was developed to plan and visualize respiratory patterns. Both are available open source and linked in this paper. The developed phantom will facilitate future work in evaluating the impact of respiratory motion on lesion quantification and detectability.
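
As a rough illustration of the kind of parameterized respiratory waveform the actuation mechanism is described as executing, the sketch below generates a simple raised-cosine lung-volume profile from a breathing rate and tidal volume. The function name and waveform shape are illustrative assumptions, not the published planning software.

```python
import numpy as np

def respiratory_waveform(breaths_per_min: float, tidal_volume_ml: float,
                         duration_s: float, dt: float = 0.01):
    """Return (t, volume) for a raised-cosine breathing pattern.

    Volume oscillates between 0 and tidal_volume_ml at the given rate.
    """
    t = np.arange(0.0, duration_s, dt)
    f = breaths_per_min / 60.0                                  # breathing frequency, Hz
    volume = 0.5 * tidal_volume_ml * (1.0 - np.cos(2.0 * np.pi * f * t))
    return t, volume

# Example within the stated phantom limits: 15 breaths/min, 800 mL tidal volume.
t, v = respiratory_waveform(breaths_per_min=15, tidal_volume_ml=800, duration_s=60)
print(f"Peak inhaled volume: {v.max():.0f} mL over {t[-1]:.1f} s")
```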