Abstract

Link prediction is essential for identifying hidden relationships within network data, with significant implications for fields such as social network analysis and bioinformatics. Traditional methods often overlook potential relationships among common neighbors, which limits how fully they exploit graph information. To address this, we introduce a novel approach, Common Neighbor Completion with Information Entropy (IECNC), which enhances model expressiveness by considering logical neighbor relationships. Our method integrates a dynamic node function with a Message Passing Neural Network (MPNN), focusing on first-order neighbors and employing set-based aggregation to improve missing-link prediction. By combining the MPNN with the information entropy of probabilistic common-neighbor predictions, using entropy to quantify uncertainty in adjacent connections, our approach significantly improves prediction accuracy. Experimental results demonstrate that IECNC achieves state-of-the-art performance across multiple datasets, surpassing existing techniques. Furthermore, visualizations confirm that our model effectively captures and accurately learns feature information from various categories, demonstrating the method's efficacy and adaptability.
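The abstract's core idea of weighting common-neighbor evidence by the entropy of predicted link probabilities can be illustrated with a minimal sketch. The helper names (`binary_entropy`, `entropy_weighted_cn_score`) and the certainty-weighted aggregation are illustrative assumptions, not the paper's exact formulation:

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy (in bits) of a Bernoulli variable with probability p."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

def entropy_weighted_cn_score(cn_probs: list[float]) -> float:
    """Aggregate predicted common-neighbor link probabilities for a node pair,
    down-weighting uncertain (high-entropy) predictions.

    cn_probs: predicted probability that each candidate common neighbor
    is actually linked to both endpoints (a hypothetical MPNN output).
    """
    score = 0.0
    for p in cn_probs:
        certainty = 1.0 - binary_entropy(p)  # 1 when confident, 0 when p = 0.5
        score += certainty * p
    return score

# A confident common neighbor (p = 0.95) contributes far more than
# an uncertain one (p = 0.55), even though both probabilities exceed 0.5.
print(entropy_weighted_cn_score([0.95, 0.55]))
```

In this sketch, a prediction near 0.5 carries maximal entropy and is almost ignored, reflecting the abstract's use of entropy to assess uncertainty in adjacent connections.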