Abstract

5G cellular networks are expected to feature smaller cell sizes, much denser resource deployment, and nearly random geometric patterns, driven by diminishing spectrum resources and rapidly growing, diversified communication demands. Such random small-cell networks produce far more complicated interference scenarios, which cannot be captured by the well-accepted hexagonal grid model. Consequently, how to model the interference pattern and how to reuse the scarce spectrum resource to achieve optimal performance in 5G cellular networks have attracted much attention from both academia and industry. In this paper, a brand-new approach, referred to as the matrix graph, is proposed. This approach is robust to interference and random topology. Based on the derived properties of the matrix graph, an asymptotically optimal algorithm with low complexity is obtained to address the spectrum allocation problem with interference constraints, which is known to be NP-hard. The proposed algorithm yields a fundamental tradeoff between spectrum reuse ratio and computational complexity. Simulation results also support the theoretical performance gains. As a result, the proposed matrix graph approach is particularly useful for characterizing next-generation cellular networks.
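The spectrum allocation problem with interference constraints mentioned in the abstract is closely related to graph vertex coloring: cells are vertices, interference relations are edges, and channels are colors. The following is a minimal illustrative sketch of that connection using a standard greedy coloring heuristic; it is not the paper's matrix-graph algorithm, and the function name and data layout are assumptions for illustration only.

```python
def greedy_spectrum_allocation(cells, interference):
    """Assign each cell the smallest channel not used by its interferers.

    cells: iterable of cell identifiers.
    interference: dict mapping a cell to the set of cells it interferes
    with (assumed symmetric). This is a generic greedy-coloring heuristic,
    not the matrix-graph method proposed in the paper.
    """
    assignment = {}
    # Process high-degree (most-interfered) cells first, a common heuristic
    # that tends to reduce the total number of channels used.
    for cell in sorted(cells,
                       key=lambda c: len(interference.get(c, ())),
                       reverse=True):
        used = {assignment[n] for n in interference.get(cell, ())
                if n in assignment}
        ch = 0
        while ch in used:  # pick the smallest free channel index
            ch += 1
        assignment[cell] = ch
    return assignment


# Example: four cells in an interference ring 0-1-2-3-0.
interference = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
alloc = greedy_spectrum_allocation(range(4), interference)
# No two interfering cells share a channel; the ring needs only 2 channels.
```

The spectrum reuse ratio in this toy setting would be the number of cells divided by the number of distinct channels used; the tradeoff studied in the paper concerns how close an algorithm can get to the optimal ratio at a given computational cost.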
