Abstract
Aggregation functions can be regarded as the multiplication of an aggregation matrix with the node embeddings, and a full-rank aggregation matrix enhances the representation capacity of Graph Neural Networks (GNNs). In this work, we study the full-rank aggregation matrix and its functional form, i.e., the injective aggregation function, and show that injectivity is necessary to guarantee rich representation capacity in GNNs. To this end, we conduct a theoretical injectivity analysis of typical feature aggregation methods and provide solutions for turning non-injective aggregation functions into injective ones. Based on our injective aggregation functions, we construct various GNN architectures by combining the aggregation functions with the other ingredient of GNNs, node feature encoding, in different ways. Following these architectures, we highlight that applying our injective aggregation function entirely as a pre-processing step, before independent node feature learning, simultaneously achieves satisfactory performance and computational efficiency on large-scale graph-based traffic data for traffic state prediction. Through comprehensive experiments on standard node classification benchmarks and practical traffic state data (from the cities of Chengdu and Xi'an), we show that the representation capacity of GNNs can be improved with our injective aggregation functions through simple changes to the model.
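To make the aggregation-as-pre-processing idea concrete, the sketch below shows one way such a pipeline could look: a full-rank (hence injective, as a linear map) aggregation matrix is applied to the node features once, offline, and the result can then be fed to any independent node-wise model. The specific construction used here, adding a scaled identity to a row-normalized adjacency, is an illustrative assumption and not the paper's own injective aggregation function.

```python
# Minimal sketch (assumptions noted below), not the authors' exact method:
# apply an aggregation step once as pre-processing, then train any
# node-wise model on the aggregated features without touching the graph again.
import numpy as np


def full_rank_aggregation(adj: np.ndarray, eps: float = 1.0) -> np.ndarray:
    """Return an aggregation matrix A_hat = D^{-1} A + eps * I.

    Adding eps * I is one simple way to make the aggregation matrix full
    rank, so the map X -> A_hat @ X is injective (illustrative assumption;
    the paper derives its own injective variants).
    """
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0                       # avoid division by zero
    a_norm = adj / deg                        # row-normalized adjacency
    return a_norm + eps * np.eye(adj.shape[0])


def preprocess_features(adj: np.ndarray, x: np.ndarray, hops: int = 2) -> np.ndarray:
    """Apply the aggregation `hops` times as a one-off pre-processing step."""
    a_hat = full_rank_aggregation(adj)
    for _ in range(hops):
        x = a_hat @ x
    return x


if __name__ == "__main__":
    # Toy 4-node graph with random 3-dimensional node features.
    adj = np.array([[0, 1, 0, 0],
                    [1, 0, 1, 1],
                    [0, 1, 0, 1],
                    [0, 1, 1, 0]], dtype=float)
    x = np.random.randn(4, 3)
    x_agg = preprocess_features(adj, x, hops=2)
    print(x_agg.shape)  # (4, 3): same shape, neighborhood information baked in
```

Because the aggregation is computed once before training, the downstream node feature learner (e.g., an MLP) never needs the graph at training time, which is what makes this setup cheap on large graphs such as city-scale traffic networks.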