Abstract

The majority of medical practitioners use C-arm fluoroscopy in transrectal ultrasound (TRUS) guided prostate brachytherapy, but only in a qualitative manner. The ability to register the implanted seeds (visible in fluoroscopy) to the soft-tissue anatomy (visible in TRUS) intra-operatively would allow immediate correction of dosimetric deviations from the optimal implant plan. The three major obstacles are: (a) recovering the 3D pose of the fluoroscopic images, (b) registering fluoroscopy space to TRUS space, and (c) reconstructing the 3D positions of the seeds from multiple fluoroscopic images. We have addressed the first two issues with a novel external fiducial structure [Jain et al., submitted to AAPM 2004]. The missing link, intra-operative seed reconstruction from C-arm fluoroscope images, is addressed here.

A brute-force formalization of the seed-matching problem results in a search space on the order of 10^150 and 10^300 for 2 and 3 fluoroscopic images, respectively. Hence previously proposed seed-matching approaches have been predominantly heuristic explorations of the search space [Todor et al., PMB 2002:47; Narayanan et al., Med. Phys. 2002:29; Tubic et al., Med. Phys. 2001:28; Su et al., MICCAI 2003], with no theoretical guarantee on the accuracy of the answer. We cast seed matching as a specific form of combinatorial optimization. Our formulation has several salient features: (a) exact solutions studied extensively by the computer science community, (b) performance claims on the space-time complexity of the algorithm, (c) optimality bounds on the final solution, (d) guaranteed existence of a polynomial-time solution for the global minimum in seed matching from 2 images, (e) a proof of the non-existence of a polynomial-time solution for more than 2 images, and (f) derivation of a practical solution that runs in near-polynomial time on any number of images.

A network flow formulation models the seed-matching problem: any flow in the network represents a seed matching, and the desired solution is the flow of minimum cost. Primal-dual algorithms for such min-cost flow problems are well known in the literature. Furthermore, the min-cost flow problem for 2 and 3 fluoroscope images reduces to the bipartite and tripartite matching problems, respectively, for which fast implementations are available. Although 2 images were already known to be insufficient for seed reconstruction, the amount of inherent error was not, and its analysis became possible with our polynomial-time algorithm. A third image renders the generic reconstruction problem non-polynomial in complexity, yet we solve it in near-polynomial time by exploiting its special structure.

Table 1 shows performance on synthetic data in which neighboring seeds are separated by 1 cm within a volume of 5 × 3 × 3 cm. The inherent imprecision of the two-image case is indicated by the low back-projection error and high reconstruction error for the unmatched seeds, which is adequately resolved with a third image. Note that in the case of overlapping seeds, any flip in the matching, although counted as erroneous, still reconstructs the seeds at the correct 3D locations. Runtime for most cases was under a minute on a PC (900 MHz, 512 MB RAM) running Matlab on Windows 2000. The network flow approach appears to be sufficiently robust and fast on synthetic data for a large number of implanted seeds. The algorithm is being refined to increase robustness to noise and to accommodate seed orientation.
Experimental validation is currently being carried out using a precision-machined mechanical phantom in which seeds can be placed at various a priori known positions. Further results will be made available at the conference.
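As an illustrative sketch (not part of the original submission, and not the authors' Matlab implementation), the two-image case described above reduces to a minimum-cost bipartite assignment: each seed shadow in one fluoroscopic image is paired with a shadow in the other so that the summed distance between the corresponding back-projection rays is minimal. The Python snippet below assumes an idealized geometry with known X-ray source positions and detected seed centroids in world coordinates (hypothetical helper names `ray`, `ray_ray_distance`, `match_two_images`), and solves the assignment with the Hungarian algorithm via scipy.optimize.linear_sum_assignment.

```python
# Hypothetical sketch of 2-image seed matching as a minimum-cost bipartite
# assignment. Geometry is idealized; in practice the C-arm pose would come
# from the fiducial-based registration described in the abstract.
import numpy as np
from scipy.optimize import linear_sum_assignment

def ray(source, pixel):
    """Unit direction of the back-projection ray from the X-ray source
    through a detected seed centroid (both in 3D world coordinates)."""
    d = pixel - source
    return d / np.linalg.norm(d)

def ray_ray_distance(p1, d1, p2, d2):
    """Shortest distance between two back-projection rays; used as the
    matching cost (rays from a correctly matched seed nearly intersect)."""
    n = np.cross(d1, d2)
    n_norm = np.linalg.norm(n)
    if n_norm < 1e-12:                       # nearly parallel rays
        return np.linalg.norm(np.cross(p2 - p1, d1))
    return abs(np.dot(p2 - p1, n)) / n_norm

def match_two_images(src_a, seeds_a, src_b, seeds_b):
    """Build the matrix of pairwise ray-to-ray distances and solve the
    bipartite matching with the Hungarian algorithm."""
    cost = np.array([[ray_ray_distance(src_a, ray(src_a, a),
                                       src_b, ray(src_b, b))
                      for b in seeds_b] for a in seeds_a])
    rows, cols = linear_sum_assignment(cost)  # global minimum-cost matching
    return list(zip(rows, cols)), cost[rows, cols].sum()
```

With three images the analogous cost is defined over seed triplets, turning the problem into a three-dimensional assignment for which no polynomial-time algorithm is known, which is consistent with the complexity distinction drawn in the abstract between the 2-image and 3-image cases.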
