Abstract

Hemodynamic parameters can support surveillance of complication risk in abdominal aortic aneurysms following endovascular aneurysm repair (EVAR). However, obtaining these parameters through computational fluid dynamics (CFD) is operationally complex and computationally expensive. Recently proposed physics-informed neural networks offer a novel way to address these issues by leveraging the fundamental conservation laws of fluid dynamics. Building on cardiovascular point datasets, we propose an integrated algorithm combining a physics-informed PointNet with quadratic residual networks (PIPN-QN) that maps sparse point clouds to four-dimensional hemodynamic parameters. The implemented workflow generates point-cloud datasets through CFD simulation and dynamically reproduces the three-dimensional flow field in both space and time through deep learning. Compared with the physics-informed PointNet (PIPN), PIPN-QN reduces the mean squared error of pressure and wall shear stress by around 32.1% and 33.1%, respectively, and predicts hemodynamic parameters in under 2 s (14 400 times faster than CFD). To address the challenge of big-data requirements, we quantify the universal flow field using a reduced number of supervision points rather than the large number of point clouds generated by the CFD simulation. PIPN-QN can deliver real-time hemodynamic parameters for patients with abdominal aortic aneurysms following EVAR with higher accuracy, faster speed, and lower training cost.
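The quadratic residual component can be illustrated with a minimal sketch of a quadratic neuron, which replaces the purely linear pre-activation of a standard layer with a product of two affine maps plus a quadratic term. The abstract does not give the exact PIPN-QN layer definition, so the formulation, names, and shapes below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def quadratic_neuron(x, Wr, br, Wg, bg, Wb, c):
    """Generic quadratic-neuron pre-activation:
        (x Wr + br) * (x Wg + bg) + (x**2) Wb + c
    Reduces to an ordinary linear layer when Wg = 0, bg = 1, Wb = 0, c = 0.
    This is an assumed form; the paper's PIPN-QN layer may differ."""
    return (x @ Wr + br) * (x @ Wg + bg) + (x ** 2) @ Wb + c

# Tiny forward pass: 5 points with 3 spatial coordinates -> 4 hidden units.
rng = np.random.default_rng(0)
x = rng.standard_normal((5, 3))
shapes = [(3, 4), (4,), (3, 4), (4,), (3, 4), (4,)]
params = [rng.standard_normal(s) * 0.1 for s in shapes]
h = quadratic_neuron(x, *params)
print(h.shape)  # (5, 4)
```

The multiplicative term lets each unit represent second-order interactions of its inputs directly, which is one plausible reason a quadratic residual block can fit sharp pressure and wall-shear-stress variations with fewer parameters than a deeper purely linear stack.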
