Abstract

Conventional ultrasound (US) machines employ a physical control panel (PCP) as the primary user interface for machine control. This panel sits adjacent to the main display, which requires the operator's constant attention, and switching attention to the control panel can interrupt the flow of the medical examination. Some ultraportable machines also lack many physical controls. Furthermore, the need to both control the US machine and observe the US image may lead practitioners to adopt unergonomic postures and repetitive motions that can cause work-related injuries. There is therefore a need for a more efficient human-computer interaction method on US machines. To address some of the limitations of the PCP, we propose merging the PCP into the main screen of the US machine. We propose using gaze tracking together with a handheld controller so that machine control can be achieved via a multimodal human-computer interaction (HCI) method that requires neither touching the screen nor looking away from the US image. As a first step, a pop-up menu and a measurement tool were designed on top of the US image, positioned according to gaze, for efficient machine control. A comparative study was performed on the BK Medical SonixTOUCH US machine. Participants were asked to measure the area of an ellipse-shaped tumor in a phantom using both our gaze-supported HCI method and the traditional method. The user study indicates that the task completion time can be reduced by [Formula: see text] when using our gaze-supported HCI, while no extra workload is imposed on the operators. Our preliminary study suggests that, when combined with a simple handheld controller, eye gaze tracking can be integrated into the US machine HCI for more efficient machine control.
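
For illustration, the area measurement in the user-study task reduces to simple geometry: once the operator marks the endpoints of the tumor's major and minor axes (for instance, by fixating on a point and confirming it with the handheld controller), the ellipse area follows from the two axis lengths. The Python sketch below is a minimal, hypothetical example of that computation; the coordinates, function names, and the gaze-plus-click confirmation step are illustrative assumptions, not details taken from the paper.

    import math

    def axis_length(p1, p2):
        """Euclidean distance between two caliper points (e.g., gaze-confirmed), in mm."""
        return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

    def ellipse_area(major_axis_mm, minor_axis_mm):
        """Area of an ellipse given its full major and minor axis lengths."""
        return math.pi * (major_axis_mm / 2.0) * (minor_axis_mm / 2.0)

    # Hypothetical caliper placements (mm) confirmed via gaze fixation + controller click.
    major = axis_length((10.0, 20.0), (34.0, 20.0))   # 24 mm major axis
    minor = axis_length((22.0, 12.0), (22.0, 28.0))   # 16 mm minor axis
    print(f"Tumor cross-sectional area: {ellipse_area(major, minor):.1f} mm^2")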
