Abstract
We present a 3D extension of the Autoprogressive Method (AutoP) for quantitative quasi-static ultrasonic elastography (QUSE) based on sparse sampling of force-displacement measurements. Compared to current model-based inverse methods, our approach requires neither geometric nor constitutive model assumptions. We build upon our previous report for 2D QUSE and demonstrate the feasibility of recovering the 3D linear-elastic material property distribution of gelatin phantoms under compressive loads. Measurements of boundary geometry, applied surface forces, and axial displacements enter into AutoP where a Cartesian neural network constitutive model (CaNNCM) interacts with finite element analyses to learn physically consistent material properties with no prior constitutive model assumption. We introduce a new regularization term uniquely suited to AutoP that improves the ability of CaNNCMs to extract information about spatial stress distributions from measurement data. Results of our study demonstrate that acquiring multiple sets of force-displacement measurements by moving the US probe to different locations on the phantom surface not only provides AutoP with the necessary information for a CaNNCM to learn the 3D material property distribution, but may significantly improve the accuracy of the Young’s modulus estimates. Furthermore, we investigate the trade-offs of decreasing the contact area between the US transducer and phantom surface in an effort to increase sensitivity to surface force variations without additional instrumentation. Each of these modifications improves the ability of CaNNCMs trained in AutoP to learn the spatial distribution of Young’s modulus from force-displacement measurements.