Recent advances in deep learning have revolutionized drug discovery, with Transformer-based models emerging as powerful tools for molecular design and property prediction. However, the lack of explainability in such models remains a significant challenge. In this study, we introduce ABIET (Attention-Based Importance Estimation Tool), an explainable Transformer model designed to identify the regions most critical for drug-target interactions, namely functional groups (FGs), in biologically active molecules. Functional groups play a pivotal role in determining chemical behavior and biological interactions. Our approach leverages attention weights from Transformer-encoder architectures trained on SMILES representations to assess the relative importance of molecular subregions. By processing attention scores with a dedicated strategy (considering bidirectional interactions, layer-based extraction, and activation transformations), we effectively distinguish FG from non-FG atoms. Experimental validation on diverse datasets targeting pharmacological receptors, including VEGFR2, AA2A, GSK3, JNK3, and DRD2, demonstrates the model's robustness and interpretability. Comparative analysis with state-of-the-art gradient-based and perturbation-based methods confirms ABIET's superior performance, with functional groups receiving statistically significantly higher importance scores. This work enhances the transparency of Transformer predictions, providing critical insights for molecular design, structure-activity analysis, and targeted drug development.
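The exact aggregation scheme is defined in the full paper; as a rough illustration only, the sketch below (all names, shapes, and the ReLU transform are assumptions, not the authors' code) shows how per-token importance could be derived from symmetrized, layer-selected attention over a SMILES sequence and compared between FG and non-FG tokens.

```python
import torch

def atom_importance(attn, fg_mask, layer=-1, transform=torch.relu):
    """Score SMILES tokens from Transformer attention (hypothetical scheme).

    attn:    (layers, heads, seq, seq) attention weights from an encoder
             run on one tokenized SMILES string.
    fg_mask: (seq,) boolean mask marking tokens that belong to a
             functional group.
    """
    # Layer-based extraction: pick one layer and average over its heads.
    A = attn[layer].mean(dim=0)            # (seq, seq)
    # Bidirectional interactions: symmetrize attention given and received.
    A = 0.5 * (A + A.transpose(0, 1))
    # Activation transform applied before aggregation (assumed ReLU here).
    A = transform(A)
    # Token importance = total symmetrized attention it exchanges.
    scores = A.sum(dim=1)                  # (seq,)
    return scores[fg_mask].mean(), scores[~fg_mask].mean()

# Toy usage with random weights: 4 layers, 8 heads, 12 SMILES tokens.
attn = torch.softmax(torch.randn(4, 8, 12, 12), dim=-1)
fg_mask = torch.zeros(12, dtype=torch.bool)
fg_mask[3:6] = True  # tokens assumed to map to a functional group
fg_score, non_fg_score = atom_importance(attn, fg_mask)
print(f"FG mean importance: {fg_score:.3f}  non-FG: {non_fg_score:.3f}")
```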