Abstract

Purpose: Neurosurgical laser ablation is experiencing a renaissance, and computational tools for ablation planning aim to further improve the intervention. Here, global optimisation and inverse-problem techniques are demonstrated to train a model that predicts maximum laser ablation extent.

Methods: A closed-form steady-state model is trained on, and subsequently compared to, N = 20 retrospective clinical MR thermometry datasets. The Dice similarity coefficient (DSC) measures the region overlap between the 57 °C isotherms of the thermometry data and the model-predicted ablation regions; 57 °C serves as a surrogate for tissue death at thermal steady state. A global optimisation scheme samples the dominant model parameter sensitivities, blood perfusion (ω) and the effective optical attenuation coefficient (μeff), over a parameter space totalling 11 440 value pairs. This yields a lookup table of μeff–ω pairs with the corresponding DSC value for each patient dataset. The μeff–ω pair with the maximum DSC calibrates the model parameters, maximising predictive value for each patient. Finally, leave-one-out cross-validation using the global optimisation information trains the model on the entire clinical dataset and compares it against the naïve model that uses literature values for ω and μeff.

Results: With naïve literature values, the model's mean DSC is 0.67, whereas the calibrated model achieves 0.82 during cross-validation, an improvement of 0.15 in overlap with the patient data. The 95% confidence interval of the mean difference is 0.083–0.23 (p < 0.001).

Conclusions: During cross-validation, the calibrated model is superior to the naïve model as measured by DSC, a 22% improvement in mean prediction accuracy. Calibration enables a relatively simple model to become substantially more predictive.
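The calibration procedure described above, i.e. exhaustively scoring μeff–ω pairs by DSC against a reference ablation mask, can be sketched in a few lines. The forward model below is a deliberately simplified stand-in (a disc whose radius shrinks with perfusion and optical attenuation), not the paper's closed-form steady-state model, and the grid ranges are illustrative rather than the authors' values; only the grid size (80 × 143 = 11 440 pairs) and the DSC-argmax logic follow the abstract.

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two boolean masks."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def predicted_mask(mu_eff, omega, shape=(64, 64), centre=(32, 32)):
    """Toy forward model: predicted ablation region is a disc whose
    radius shrinks with perfusion (omega) and attenuation (mu_eff).
    Stands in for the paper's closed-form steady-state model."""
    yy, xx = np.mgrid[:shape[0], :shape[1]]
    r = np.hypot(yy - centre[0], xx - centre[1])
    radius = 20.0 / (1.0 + 0.05 * mu_eff + 0.5 * omega)
    return r <= radius

# Reference "thermometry" mask (57 degC isotherm) -- synthetic here.
reference = predicted_mask(mu_eff=8.0, omega=2.0)

# Global optimisation by exhaustive lookup table: 80 * 143 = 11 440 pairs.
mu_grid = np.linspace(1.0, 20.0, 80)      # illustrative range
omega_grid = np.linspace(0.1, 10.0, 143)  # illustrative range
best = max(
    ((dice(predicted_mask(m, w), reference), m, w)
     for m in mu_grid for w in omega_grid),
    key=lambda t: t[0],
)
print(f"best DSC = {best[0]:.3f} at mu_eff = {best[1]:.2f}, omega = {best[2]:.2f}")
```

In the study, this argmax is computed once per patient; leave-one-out cross-validation then evaluates how well parameters calibrated on the other 19 patients predict the held-out patient's ablation region.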
