Abstract

Accurate prediction of molecular properties is crucial for advancements in drug discovery and materials science. However, this task is complex and requires effective representations of molecular structures. Recently, Graph Neural Networks (GNNs) have emerged as powerful tools for this purpose, demonstrating significant potential in modeling molecular data. Despite advances in GNN predictive performance, it remains unclear how architectural choices, particularly activation functions, affect training dynamics and the interpretation of predictions at inference time. To address this gap, this paper introduces a novel activation function, the Sine Linear Unit (SLU), aimed at enhancing the predictive capabilities of GNNs for molecular property prediction. To demonstrate the effectiveness of SLU within GNN architectures, we conduct experiments on diverse molecular datasets encompassing various regression and classification tasks. Our findings indicate that SLU consistently outperforms traditional activation functions on hydration free energy (FreeSolv), inhibitory binding of human β-secretase (BACE), and blood-brain barrier penetration (BBBP), achieving superior performance on each task, with one exception on the GCN model using the QM9 dataset. These results underscore SLU's potential to significantly improve prediction accuracy, making it a valuable addition to the field of molecular modeling.
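
The abstract does not give the closed form of SLU, so the sketch below is only a minimal illustration of the drop-in pattern it implies: an element-wise, sine-based activation swapped into a GCN-style layer in place of ReLU. The SLU expression shown (x · sin(x)) is a hypothetical placeholder, not the authors' definition, and the layer is a plain PyTorch approximation rather than the models used in the paper.

```python
import torch
import torch.nn as nn


class SLU(nn.Module):
    """Illustrative stand-in for the Sine Linear Unit.

    The exact formula is not stated in the abstract; x * sin(x) is an
    assumed form used only to show how a periodic, element-wise
    activation plugs into a GNN layer.
    """

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * torch.sin(x)  # hypothetical form, not the paper's definition


class SimpleGCNLayer(nn.Module):
    """Minimal GCN-style layer: mean aggregation over neighbors,
    a linear map, then a swappable activation."""

    def __init__(self, in_dim: int, out_dim: int, activation: nn.Module):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.activation = activation

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # adj: dense adjacency matrix with self-loops
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        h = (adj / deg) @ x  # row-normalized neighborhood aggregation
        return self.activation(self.linear(h))


# Usage: compare activations on the same layer by swapping nn.ReLU() for SLU().
x = torch.randn(5, 16)                          # 5 nodes, 16 features each
adj = torch.eye(5) + torch.rand(5, 5).round()   # toy adjacency with self-loops
layer = SimpleGCNLayer(16, 32, activation=SLU())
out = layer(x, adj)
print(out.shape)  # torch.Size([5, 32])
```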
