Abstract

Machine learning tasks on the nodes and edges of networks rely heavily on feature engineering, which requires expert knowledge and careful effort. In recent years, interest has grown in low-dimensional vector representations of nodes and edges. However, existing methods for signed networks learn only node vectors, omitting edge information and requiring extra effort to design edge vectors. In this work, we develop a framework for learning both node and edge vectors for signed networks, so that edge vectors can directly represent edge properties, thereby improving performance on link-oriented tasks. Our framework for learning network features is as follows. We assume a global mapping between the node and edge vector spaces; this assumption lets us transform the problem into learning the mapping function and the node vectors. We propose a notion of node proximity for signed networks, generalized from the second-order node proximity used for unsigned networks. It provides a unified objective function that preserves both the node and edge patterns of the network. Based on this definition, we propose two signed network representation methods. The first, neural signed network embedding (nSNE), learns the node vectors and the mapping function with neural networks, exploiting the power of deep learning to fit the data. The second, light signed network embedding (lSNE), specifies the mapping function as a simple linear function; it has fewer parameters to estimate and is equivalent to factorizing both the similarity and sign matrices. We compare our methods with three state-of-the-art methods on four datasets. The results show that our methods are competitive.
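To make the lSNE idea concrete, the following is a minimal illustrative sketch (not the paper's code): node vectors are learned by factorizing a toy signed adjacency matrix, and an edge vector is then derived through an assumed linear mapping (here, the elementwise product of the endpoint vectors). The matrix `S`, the dimension `d`, the learning rate, and the choice of mapping are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy signed network: 4 nodes, symmetric sign matrix (+1/-1 edges, 0 = no edge).
S = np.array([[ 0,  1, -1,  0],
              [ 1,  0, -1,  1],
              [-1, -1,  0, -1],
              [ 0,  1, -1,  0]], dtype=float)

n, d = S.shape[0], 2                    # d: embedding dimension (assumed)
U = 0.1 * rng.standard_normal((n, d))   # node vectors to be learned

# Gradient descent on ||M * (S - U U^T)||_F^2, masking absent edges,
# i.e. a symmetric factorization of the observed sign matrix.
M = (S != 0).astype(float)
lr = 0.05
for _ in range(500):
    R = M * (S - U @ U.T)   # masked reconstruction error
    U += lr * (R @ U)       # gradient step for the symmetric factorization

# Assumed linear edge mapping: the edge vector for (i, j) is the
# elementwise product of the endpoint node vectors; summing its
# components gives a score whose sign predicts the edge sign.
def edge_vector(i, j):
    return U[i] * U[j]

pred_sign = np.sign(edge_vector(0, 1).sum())
```

In this sketch the edge representation is computed directly from the learned node vectors, which mirrors the abstract's point that a shared node-to-edge mapping removes the need to hand-design edge features.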
