Abstract
The acquisition of fetal biometric measurements via 2-D B-mode ultrasound (US) scans is crucial for fetal monitoring. However, acquiring standardised head, abdominal and femoral image planes is challenging due to variable image quality. There remains a significant discrepancy between the way automated computer vision algorithms and human sonographers perform this task; this paper contributes to bridging this gap by building knowledge of US image perception into a pipeline for classifying images obtained during 2-D fetal US scans. We record the eye movements of 10 participants, each performing four 2-D US scans on a phantom fetal model at varying orientations. We analyse their eye movements to establish which high-level constraints and visual cues are used to localise the standardised abdominal plane. We then build a vocabulary of visual words trained on SURF descriptors extracted around eye fixations, and use the resulting bag-of-words model to classify head, abdominal and femoral image frames acquired during 10 clinical US scans and 10 further phantom US scans. On phantom data, we achieve classification accuracies of 89%, 87% and 85% for the head, abdominal and femoral images respectively. On clinical data, we achieve classification accuracies of 76%, 68% and 64% for the head, abdominal and femoral images respectively. This constitutes the first insight into image perception during real-time US scanning, and a proof of concept for training bag-of-words models for US image analysis on human eye movements.
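The sketch below illustrates the kind of fixation-guided bag-of-visual-words pipeline the abstract describes: SURF descriptors are computed on patches around recorded eye fixations, clustered into a visual vocabulary, and each frame is encoded as a visual-word histogram for classification. It is a minimal illustration, not the authors' implementation; the function names, vocabulary size, patch radius and the linear SVM classifier are assumptions introduced here.

```python
# Minimal sketch of a fixation-guided bag-of-visual-words pipeline.
# Assumptions (not from the paper): vocabulary size, patch radius,
# helper names, and the choice of a linear SVM as the classifier.
import numpy as np
import cv2
from sklearn.cluster import KMeans
from sklearn.svm import SVC

VOCAB_SIZE = 200        # assumed number of visual words
PATCH_RADIUS = 32       # assumed keypoint size around each fixation (pixels)

def fixation_descriptors(image, fixations, surf):
    """Compute SURF descriptors at keypoints placed on recorded eye fixations."""
    keypoints = [cv2.KeyPoint(float(x), float(y), PATCH_RADIUS) for (x, y) in fixations]
    _, descriptors = surf.compute(image, keypoints)
    return descriptors

def build_vocabulary(descriptor_sets, k=VOCAB_SIZE):
    """Cluster pooled descriptors from the training frames into k visual words."""
    kmeans = KMeans(n_clusters=k, n_init=10, random_state=0)
    kmeans.fit(np.vstack(descriptor_sets))
    return kmeans

def bow_histogram(descriptors, vocabulary):
    """Encode one frame as a normalised histogram of visual-word occurrences."""
    words = vocabulary.predict(descriptors)
    hist, _ = np.histogram(words, bins=np.arange(vocabulary.n_clusters + 1))
    return hist / max(hist.sum(), 1)

# Training flow (hypothetical data variables):
# surf = cv2.xfeatures2d.SURF_create()   # needs an OpenCV build with non-free modules
# descs = [fixation_descriptors(img, fix, surf)
#          for img, fix in zip(train_images, train_fixations)]
# vocab = build_vocabulary(descs)
# X = np.array([bow_histogram(d, vocab) for d in descs])
# clf = SVC(kernel="linear").fit(X, train_labels)   # labels: head / abdomen / femur
```

At test time, each new US frame would be encoded with `bow_histogram` against the same vocabulary and passed to the trained classifier; the choice of classifier and clustering algorithm here is purely illustrative.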