Abstract

Real-time tracking of a target volume is a promising solution for reducing the planning margins and both the dosimetric and geometric uncertainties in the treatment of thoracic and upper-abdomen cancers. Respiratory motion prediction is an integral part of real-time tracking, compensating for the latency of tracking systems. The purpose of this work was to develop a novel method for accurate respiratory motion prediction using dual deep recurrent neural networks (RNNs). Respiratory motion data from 111 patients were used to train and evaluate the method. For each patient, two models were trained on 80% of the respiratory waveform, and the remaining 20% was used for evaluation. The first network (Network 1) produces a 'coarse-resolution' prediction of future points, and the second network (Network 2) provides a 'fine-resolution' prediction that interpolates between those future predictions. The performance of the method was tested using two types of RNN cells: Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU). The accuracy of each model was evaluated using the root mean square error (RMSE) and mean absolute error (MAE). Overall, the GRU-based RNN model had better accuracy than the LSTM-based RNN model (RMSE (mm): 0.4 ± 0.2 versus 0.6 ± 0.3; MAE (mm): 0.4 ± 0.2 versus 0.6 ± 0.2). The GRU was able to predict the respiratory motion accurately (<1 mm) up to a latency period of 440 ms, whereas the LSTM's accuracy was acceptable only up to 240 ms. The proposed method using the GRU can therefore be used for respiratory-motion prediction up to a latency period of 440 ms.
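The coarse-then-fine pipeline and the RMSE/MAE evaluation described above can be sketched as follows. This is an illustrative outline only: the paper's Network 1 and Network 2 are recurrent neural networks (LSTM or GRU), but here they are replaced by hand-written stand-ins (linear extrapolation and linear interpolation), and the respiratory trace is a toy sine wave, so only the data flow and the error metrics are faithful to the abstract.

```python
import math

def coarse_predict(history, n_future, step):
    """Stand-in for Network 1: predict n_future points at coarse spacing
    `step` by linearly extrapolating the last two samples (the real
    Network 1 is an RNN)."""
    slope = history[-1] - history[-2]
    return [history[-1] + slope * step * (k + 1) for k in range(n_future)]

def fine_interpolate(p0, p1, n_between):
    """Stand-in for Network 2: fill n_between points between two coarse
    predictions (here linearly; the real Network 2 is an RNN)."""
    return [p0 + (p1 - p0) * (i + 1) / (n_between + 1) for i in range(n_between)]

def rmse(actual, predicted):
    """Root mean square error, in the units of the signal (e.g. mm)."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def mae(actual, predicted):
    """Mean absolute error, in the units of the signal."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

# Toy respiratory trace: a sine wave standing in for tumour position (mm).
trace = [5 * math.sin(2 * math.pi * t / 40) for t in range(60)]
history, future = trace[:40], trace[40:]

# Coarse prediction of every 4th future sample, then fine interpolation
# to recover the full-rate prediction, mirroring the dual-network design.
coarse = coarse_predict(history, n_future=5, step=4)
fine = []
prev = history[-1]
for c in coarse:
    fine.extend(fine_interpolate(prev, c, n_between=3))
    fine.append(c)
    prev = c

print(f"RMSE = {rmse(future[:len(fine)], fine):.2f} mm")
print(f"MAE  = {mae(future[:len(fine)], fine):.2f} mm")
```

In the paper, both stages are learned per patient from the first 80% of the waveform; the split above (`history` vs `future`) mimics that train/evaluate division on the toy signal.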

