Molecular property prediction with deep learning often relies on self-supervised pretraining, typically masked atom prediction, to learn general chemical knowledge. However, the atom-level knowledge gained through masked atom prediction differs substantially from the graph-level objectives of downstream tasks, leading to suboptimal transfer. For properties with limited data in particular, ignoring domain knowledge forces a direct search over an immense, generic representation space, making it infeasible to identify the global optimum. To address this, we propose MPCD, which improves pretraining transferability by using domain knowledge to align the optimization objectives of pretraining and fine-tuning. MPCD also leverages multitask learning to improve data utilization and model robustness. Technically, MPCD employs a relation-aware self-attention mechanism to comprehensively capture both the local and global structures of molecules. Extensive validation demonstrates that MPCD outperforms state-of-the-art methods on absorption, distribution, metabolism, excretion, and toxicity (ADMET) and physicochemical property prediction across a range of data sizes.
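To make the relation-aware self-attention idea concrete, below is a minimal sketch of one plausible realization: a Graphormer-style additive bias, in which a learned embedding of each atom pair's relation (e.g., bond type or shortest-path distance) is added to the attention logits, so each head sees both local connectivity and global structure. This is an illustrative assumption, not the paper's confirmed architecture; all names (`RelationAwareSelfAttention`, `rel_bias`, `num_relations`) are hypothetical.

```python
# Hypothetical sketch of relation-aware self-attention (PyTorch), assuming a
# Graphormer-style bias: pairwise relation embeddings are added to the
# attention logits before the softmax. MPCD's actual design may differ.
import torch
import torch.nn as nn


class RelationAwareSelfAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int, num_relations: int):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.qkv = nn.Linear(dim, 3 * dim)
        self.out = nn.Linear(dim, dim)
        # One scalar bias per (relation id, head): injects pairwise structure
        # such as bond type or shortest-path distance into the attention map.
        self.rel_bias = nn.Embedding(num_relations, num_heads)

    def forward(self, x: torch.Tensor, rel: torch.LongTensor) -> torch.Tensor:
        # x:   (batch, n_atoms, dim)      atom features
        # rel: (batch, n_atoms, n_atoms)  integer relation id per atom pair
        b, n, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        k = k.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        v = v.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)

        # Standard scaled dot-product scores: (batch, heads, n, n).
        scores = (q @ k.transpose(-2, -1)) / self.head_dim ** 0.5
        # Relation bias: (batch, n, n, heads) -> (batch, heads, n, n),
        # added before the softmax so attention is structure-aware.
        scores = scores + self.rel_bias(rel).permute(0, 3, 1, 2)

        attn = scores.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, n, -1)
        return self.out(out)
```

Because the bias depends only on the pairwise relation, local structure (bonded neighbors) and global structure (long-range distances) can both shape the attention pattern without restricting attention to the molecular graph's edges.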