Abstract

The performance of ultrasound elastography (USE) depends heavily on the accuracy of displacement estimation. Recently, convolutional neural networks (CNNs) have shown promising performance in optical flow estimation and have been adopted for USE displacement estimation. Networks trained on computer vision images, however, are not optimized for USE, since there is a large gap between computer vision images and high-frequency radio frequency (RF) ultrasound data. Many researchers have therefore applied transfer learning to adapt optical flow CNNs to USE and improve their performance. However, the ground-truth displacement in real ultrasound data is unknown, while simulated data exhibit a domain shift relative to real data and are computationally expensive to generate. To resolve this issue, semisupervised methods have been proposed in which networks pretrained on computer vision images are fine-tuned on real ultrasound data. In this article, we employ a semisupervised method that exploits the first- and second-order derivatives of the displacement field for regularization. We also modify the network structure to estimate both forward and backward displacements, and propose using consistency between the forward and backward strains as an additional regularizer to further enhance performance. We validate our method on several experimental phantom and in vivo datasets. We show that the network fine-tuned with our proposed method on experimental phantom data performs comparably on in vivo data to a network fine-tuned directly on in vivo data. Our results also show that the proposed method outperforms current deep learning methods and is comparable to computationally expensive optimization-based algorithms.
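To make the two regularizers described above concrete, the sketch below illustrates (with NumPy, on a toy displacement field) how first- and second-order derivative penalties and a forward-backward strain-consistency penalty might be computed. This is an illustrative assumption about the form of the loss terms, not the paper's exact formulation; the function names, the choice of L1 penalties, and the sign convention for the backward displacement are all hypothetical.

```python
import numpy as np

def smoothness_penalty(u):
    """First- and second-order derivative regularizers on a displacement field.

    u: 2-D array of axial displacements (axis 0 = axial/depth direction).
    Returns the mean absolute first- and second-order axial derivatives,
    which penalize rough and rapidly varying displacement estimates.
    """
    d1 = np.diff(u, n=1, axis=0)  # first-order axial derivative
    d2 = np.diff(u, n=2, axis=0)  # second-order axial derivative
    return np.abs(d1).mean(), np.abs(d2).mean()

def strain_consistency(u_fwd, u_bwd):
    """Consistency penalty between forward and backward strains.

    Axial strain is the depth derivative of displacement; for consistent
    forward/backward estimates the backward strain should mirror the
    forward one, so the mean absolute sum of the two is penalized.
    (Sign convention is an assumption for this sketch.)
    """
    s_fwd = np.diff(u_fwd, axis=0)
    s_bwd = np.diff(u_bwd, axis=0)
    return np.abs(s_fwd + s_bwd).mean()

# Toy example: a uniform axial compression with constant strain 0.01,
# so the first-order penalty equals the strain, the second-order penalty
# is zero, and forward/backward estimates are perfectly consistent.
depth = np.arange(64, dtype=float)[:, None] * np.ones((1, 32))
u_fwd = 0.01 * depth    # forward displacement grows linearly with depth
u_bwd = -0.01 * depth   # backward displacement is its negation

l1, l2 = smoothness_penalty(u_fwd)
consistency = strain_consistency(u_fwd, u_bwd)
```

In a semisupervised fine-tuning setup of the kind the abstract describes, terms like these would be added (with weighting coefficients) to the data-fidelity loss on real ultrasound frame pairs, for which no ground-truth displacement is available.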
