Abstract
Agent-based models (ABMs) have enabled great advances in the study of tumor development and therapeutic response, allowing researchers to explore the spatiotemporal evolution of the tumor and its microenvironment. However, these models face serious drawbacks in the realm of parameterization: ABM parameters are typically set individually based on various data and literature sources, rather than through a rigorous parameter estimation approach. While ABMs can be fit to simple time-course data (such as tumor volume), that type of data loses the spatial information that is a defining feature of ABMs. While tumor images provide spatial information, it is exceedingly difficult to compare tumor images to ABM simulations beyond a qualitative visual comparison. Without a quantitative method of comparing the similarity of tumor images to ABM simulations, rigorous parameter fitting is not possible. Here, we present a novel approach that applies neural networks to represent both tumor images and ABM simulations as low-dimensional points, with the distance between points acting as a quantitative measure of the difference between the two. This enables a quantitative comparison of tumor images and ABM simulations, where the distance between simulated and experimental images can be minimized using standard parameter-fitting algorithms. We describe this method and present two examples demonstrating its application to estimate parameters for two distinct ABMs. Overall, we provide a novel method to robustly estimate ABM parameters.
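The fitting loop described above can be sketched in miniature. This is a toy illustration under stated assumptions, not the paper's implementation: the `toy_abm_image` function is a hypothetical stand-in for an ABM simulation, and a frozen random linear projection stands in for the trained neural-network encoder. The key idea it demonstrates is the same: embed both the "experimental" image and each candidate simulation as low-dimensional points, then minimize the distance between them with a standard optimizer.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
SIZE = 16  # toy image resolution

def toy_abm_image(params, size=SIZE):
    # Hypothetical stand-in for an ABM run: a Gaussian "tumor" blob whose
    # radius and intensity are controlled by the two parameters being fitted.
    radius, intensity = params
    y, x = np.mgrid[:size, :size]
    r2 = (x - size / 2) ** 2 + (y - size / 2) ** 2
    return intensity * np.exp(-r2 / (2 * radius**2))

# Frozen random projection standing in for a trained encoder that maps an
# image to a low-dimensional point (the paper uses neural networks here).
W = rng.normal(size=(SIZE * SIZE, 4)) / SIZE

def embed(image):
    return image.ravel() @ W

# "Experimental" target image, generated with known ground-truth parameters
# so we can check that the fit recovers them.
true_params = np.array([3.0, 1.5])
target_point = embed(toy_abm_image(true_params))

def objective(params):
    # Distance in embedding space between simulation and experiment.
    return np.linalg.norm(embed(toy_abm_image(params)) - target_point)

# Standard parameter-fitting algorithm; Nelder-Mead is one common choice
# when the simulator provides no gradients.
result = minimize(objective, x0=[5.0, 1.0], method="Nelder-Mead")
print(result.x)  # should land near the ground-truth parameters [3.0, 1.5]
```

In the paper's setting the encoder would be a trained network and each objective evaluation would run a full ABM simulation, so a derivative-free or surrogate-assisted optimizer is the natural fit; the structure of the loop is otherwise unchanged.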