Abstract

The development of high-throughput, data-intensive biomedical research assays and technologies has created a need for researchers to develop strategies for analyzing, integrating, and interpreting the massive amounts of data they generate. Although a wide variety of statistical methods have been designed to accommodate 'big data,' experience with the application of artificial intelligence (AI) techniques suggests that they may be particularly appropriate. In addition, the results of applying these assays reveal great heterogeneity in the pathophysiologic factors and processes that contribute to disease, suggesting a need to tailor, or 'personalize,' medicines to the nuanced and often unique features of individual patients. Given how important data-intensive assays are to revealing appropriate intervention targets and strategies for treating an individual's disease, AI can play an important role in the development of personalized medicines. We describe many areas where AI can play such a role and argue that AI's ability to advance personalized medicine will depend critically not only on the refinement of relevant assays, but also on ways of storing, aggregating, accessing, and ultimately integrating the data they produce. We also point out the limitations of many AI techniques for developing personalized medicines and consider areas for further research.
