Abstract

As a noninvasive, radiation-free imaging modality, electrical impedance tomography (EIT) has attracted considerable attention over the last two decades and has many industrial and biomedical applications. However, because its inverse problem is nonlinear and ill-posed, EIT images suffer from low spatial resolution and are sensitive to modeling errors. To obtain EIT images with high resolution and robustness to modeling errors, a two-stage deep learning (TSDL) method is proposed. The method consists of a prereconstruction block and a convolutional neural network (CNN). The prereconstruction block learns the regularization pattern from the training data set and provides a rough reconstruction of the target. The CNN postprocesses the prereconstruction result with a multilevel feature analysis strategy and eliminates modeling errors using prior information about the shape of the observation domain. The prereconstruction and CNN blocks are trained jointly using a least-squares approach. To evaluate the performance of the TSDL method, the lung EIT problem was studied. The training data set was computed from more than 100 000 EIT simulation models generated from computed tomography (CT) scans of 792 patients. Lung injury, measurement noise, and modeling errors were randomly simulated during model generation. The trained TSDL model was evaluated with simulation tests as well as experimental tests in a laboratory setting. The results show that the TSDL method achieves accurate shape reconstructions and is robust against measurement noise and modeling errors.
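The first stage of such a pipeline can be illustrated with a minimal sketch: a linear prereconstruction map fitted from paired (measurement, image) training data by regularized least squares, whose rough output would then be refined by the CNN stage. This is a toy illustration on synthetic data; the dimensions, the stand-in forward operator `A`, and the ridge parameter `lam` are assumptions for demonstration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: m boundary voltage measurements, n image pixels,
# n_train training samples (all illustrative, not from the paper).
m, n, n_train = 208, 1024, 500

# Synthetic stand-in for the training set of paired samples.
A = rng.normal(size=(m, n))                          # toy linear forward operator
X = rng.normal(size=(n_train, n))                    # "ground-truth" conductivity images
V = X @ A.T + 0.01 * rng.normal(size=(n_train, m))   # noisy simulated measurements

# Stage 1 (prereconstruction): learn a linear map W so that V @ W ≈ X,
# fitted in closed form by ridge-regularized least squares.
lam = 1e-2
W = np.linalg.solve(V.T @ V + lam * np.eye(m), V.T @ X)

# Rough reconstructions; in the TSDL method these would feed the CNN stage.
X_hat = V @ W
print(X_hat.shape)  # (500, 1024)
```

In the paper both stages are trained jointly, whereas this closed-form fit trains the linear stage alone; it only conveys how a regularization pattern can be learned from data rather than hand-designed.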
