Abstract

Background

Identification of drug–target interactions is an indispensable part of drug discovery. While conventional shallow machine learning and recent deep learning methods based on chemogenomic properties of drugs and target proteins have pushed prediction performance to a new level, these methods still struggle to adapt to novel structures. Alternatively, large-scale biological and pharmacological data provide new ways to accelerate drug–target interaction prediction.

Methods

Here, we propose DrugMAN, a deep learning model for predicting drug–target interactions by integrating multiplex heterogeneous functional networks with a mutual attention network (MAN). DrugMAN uses a graph attention network-based integration algorithm to learn network-specific low-dimensional representations for drugs and target proteins by integrating four drug networks and seven gene/protein networks, respectively, collected under defined screening conditions. DrugMAN then captures interaction information between the drug and target representations with a mutual attention network to improve drug–target prediction.

Results

DrugMAN achieved the best performance compared with the cheminformatics-based methods SVM, RF, and DeepPurpose and the network-based deep learning methods DTINet and NeoDT in four different scenarios, especially in real-world scenarios. Compared with SVM, RF, DeepPurpose, DTINet, and NeoDT, DrugMAN showed the smallest decrease in AUROC, AUPRC, and F1-score from the warm-start to the both-cold scenario. This result is attributed to DrugMAN's learning from heterogeneous data and indicates that DrugMAN has good generalization ability. Taken together, DrugMAN spotlights heterogeneous information to mine drug–target interactions and can be a powerful tool for drug discovery and drug repurposing.
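The mutual attention idea described above can be illustrated with a minimal sketch: each side (drug and protein) attends over the other side's network-derived embeddings so that the final pair representation encodes interaction information. This is a simplified NumPy illustration under assumed dimensions, not the authors' DrugMAN implementation; the function name `mutual_attention` and all shapes are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def mutual_attention(drug, prot):
    """Bidirectional (mutual) attention between drug and protein features.

    drug: (n_d, d) network-derived drug feature vectors
    prot: (n_p, d) network-derived protein feature vectors
    Returns drug features attended over proteins, and vice versa.
    """
    d = drug.shape[1]
    scores = drug @ prot.T / np.sqrt(d)          # (n_d, n_p) pairwise affinities
    drug_ctx = softmax(scores, axis=1) @ prot    # drugs attend to proteins
    prot_ctx = softmax(scores.T, axis=1) @ drug  # proteins attend to drugs
    return drug_ctx, prot_ctx

# Toy example with hypothetical sizes: 3 drug vectors, 5 protein vectors, 8 dims.
rng = np.random.default_rng(0)
drug_emb = rng.standard_normal((3, 8))
prot_emb = rng.standard_normal((5, 8))
d_ctx, p_ctx = mutual_attention(drug_emb, prot_emb)
print(d_ctx.shape, p_ctx.shape)  # (3, 8) (5, 8)
```

In the full model, the attended representations would be combined (e.g., concatenated and passed through a classifier) to score each drug–target pair; the sketch only shows the cross-attention step itself.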
