Abstract
The rapid growth in the number of Web APIs makes it difficult for developers to find appropriate ones. To address this issue, researchers have developed a variety of powerful automatic recommendation approaches. Recently, a range of graph neural networks, drawing inspiration from Transformers, have incorporated global attention to enhance recommendation quality. However, these approaches still have limitations in how they propagate information between nodes and exploit other useful information, which reduces their effectiveness for large-scale Web API recommendation. To address these problems, this paper introduces a novel positional and semantic encoding method and a motif-based linearizing graph Transformer for automatic Web API recommendation. We integrate the semantic information of nodes into the positional information by using a fine-tuned large language model and a graph attention mechanism. Furthermore, we leverage motif information to alter the computational order of the vanilla Transformer, achieving linear time complexity. Experimental results on two real-world datasets demonstrate the suitability of our model for Web API recommendation, surpassing existing state-of-the-art methods. In summary, the proposed technique shows promising results for Web API recommendation and underscores the potential of global attention in this field.
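The abstract only sketches how reordering the Transformer's computation yields linear complexity. As a rough, hypothetical illustration of the general trick behind linear-time attention (not the authors' motif-based construction, which is described in the paper body), the sketch below contrasts vanilla quadratic attention with a kernelized variant that computes φ(K)ᵀV before multiplying by φ(Q); the ELU-based feature map and the shapes are assumptions for the example.

```python
# Minimal sketch (assumption, not the paper's exact method): replacing
# softmax(Q K^T) V, which costs O(n^2 d), with phi(Q) (phi(K)^T V),
# which costs O(n d^2) and is linear in the sequence length n.
import numpy as np

def elu_feature_map(x):
    # A common positive feature map used in linear attention (assumption).
    return np.where(x > 0, x + 1.0, np.exp(x))

def quadratic_attention(Q, K, V):
    # Vanilla attention: materializes an explicit n x n score matrix.
    scores = np.exp(Q @ K.T / np.sqrt(Q.shape[-1]))
    return (scores / scores.sum(axis=-1, keepdims=True)) @ V

def linear_attention(Q, K, V):
    # Reordered computation: never builds the n x n matrix.
    Qf, Kf = elu_feature_map(Q), elu_feature_map(K)   # (n, d) each
    KV = Kf.T @ V                                     # (d, d) key/value summary
    Z = Qf @ Kf.sum(axis=0, keepdims=True).T          # (n, 1) normalizer
    return (Qf @ KV) / Z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 1024, 64
    Q, K, V = rng.normal(size=(3, n, d))
    out = linear_attention(Q, K, V)
    print(out.shape)  # (1024, 64)
```

The key design point is associativity: because the n x n attention matrix is only ever multiplied by V, a kernel factorization lets the d x d summary φ(K)ᵀV be computed once and reused for every query, which is what makes global attention tractable on large service graphs.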