Abstract

The analysis of large-scale networks requires parallel graph-processing techniques. Hadoop, an open-source implementation of Map/Reduce, has gained popularity for its efficiency, scalability, and fault tolerance. However, as a simplified programming model, Map/Reduce is typically used in applications with massive datasets and simple processing logic. In this paper, we adapt Map/Reduce programming to more complex applications such as community detection in large-scale networks. We present a new model, LI-MR (Local Iteration Map/Reduce), to address the Map/Reduce model's limitations with respect to multi-iteration computation and random data access. A new system, LI-Hadoop, is built on top of Hadoop to implement the LI-MR model. Furthermore, we propose a new algorithm, MR-LPA, which parallelizes the Label Propagation Algorithm (LPA) to mine community structure in large-scale networks. We evaluate the performance of LI-Hadoop by executing MR-LPA on real-world datasets. The experimental results show that our approach is both effective and efficient.
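
As a point of reference (not taken from the paper itself), the sketch below shows how a single LPA iteration can be expressed as a plain Hadoop map/reduce pair; the class names, the input record format, and the tie-breaking rule are assumptions for illustration only. Because every iteration is a separate job that re-reads and rewrites the entire graph, the sketch also illustrates the multi-iteration and data-access overheads that motivate the LI-MR model.

```java
// Hypothetical sketch of one LPA iteration as a plain Hadoop job (not the paper's MR-LPA).
// Assumed input line format: nodeId <TAB> label <TAB> comma-separated neighbor ids
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class LpaIteration {

    public static class LpaMapper extends Mapper<LongWritable, Text, Text, Text> {
        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            String[] parts = value.toString().split("\t");
            String node = parts[0], label = parts[1], neighbors = parts[2];
            // Re-emit the node's own label and adjacency list so the reducer
            // can reproduce the graph structure for the next iteration.
            ctx.write(new Text(node), new Text("A|" + label + "|" + neighbors));
            // Send this node's current label to every neighbor.
            for (String nb : neighbors.split(",")) {
                ctx.write(new Text(nb), new Text("L|" + label));
            }
        }
    }

    public static class LpaReducer extends Reducer<Text, Text, Text, Text> {
        @Override
        protected void reduce(Text node, Iterable<Text> values, Context ctx)
                throws IOException, InterruptedException {
            String ownLabel = node.toString();
            String neighbors = "";
            Map<String, Integer> counts = new HashMap<>();
            for (Text v : values) {
                String s = v.toString();
                if (s.startsWith("A|")) {
                    String[] rec = s.split("\\|", 3);
                    ownLabel = rec[1];
                    neighbors = rec[2];
                } else {
                    counts.merge(s.substring(2), 1, Integer::sum);
                }
            }
            // Adopt the label held by the majority of neighbors; keep the current
            // label if no messages arrived (ties broken arbitrarily here).
            String best = ownLabel;
            int bestCount = -1;
            for (Map.Entry<String, Integer> e : counts.entrySet()) {
                if (e.getValue() > bestCount) {
                    best = e.getKey();
                    bestCount = e.getValue();
                }
            }
            ctx.write(node, new Text(best + "\t" + neighbors));
        }
    }
}
```

A driver would chain such jobs until labels stabilize, feeding each iteration's output directory to the next job; that repeated job setup and full-graph I/O is precisely the cost the LI-MR model aims to reduce.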
