The energy consumption of green cellular networking infrastructure has become a popular research domain, driven by the economic concerns of network operators and by global climate change. The growth of wireless networks has also drawn researchers to novel relay station (RS) deployment techniques that support high user density and high-data-rate services while reducing the energy consumption and operating costs of the network elements. The locations of RSs strongly influence how far a cell can be covered; RS deployment therefore plays a major role in widening a cell's coverage radius. To address these challenges and realize greener network designs by minimizing power consumption with QoS assurance, a novel traffic-based power consumption minimization and node deployment scheme for LTE green cellular networks is proposed. Key performance indicators (KPIs) are also required to monitor and improve network performance; KPIs track the achieved resource utilization and the quality of service. The work comprises RS deployment, traffic estimation, and power allocation. Inappropriate RS deployment results in power losses, communication delays, increased implementation costs, and decreased throughput. This research therefore introduces a transmission area-based relay station deployment scheme (TARSD) with multiobjective functions: maximization of the coverage area, minimization of the overlapping area, and minimization of the power consumption cost. Traffic estimation helps identify future capacity requirements and enables improved planning and decision-making; the deployment scheme of a cellular network relies on network traffic estimation. A Nash traffic entropy learning algorithm (NTEL) is used to estimate traffic, and based on the estimates, eNB switching ON/OFF is performed to minimize power consumption. Inter-cell interference (ICI) can also degrade the performance of the LTE network.
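The traffic-driven eNB switching described above can be illustrated with a minimal threshold-based sketch. The NTEL estimator itself and the paper's exact switching rule are not detailed in the abstract, so the per-eNB load estimates, the `threshold` parameter, and the headroom check below are all illustrative assumptions, not the authors' method.

```python
def switch_enbs(estimated_load, capacity, threshold=0.2):
    """Switch lightly loaded eNBs OFF based on estimated traffic.

    estimated_load: per-eNB traffic estimates (e.g. from a predictor
    such as NTEL; the estimator is not shown here).
    capacity: per-eNB serving capacity, in the same units.
    An eNB whose utilisation falls below `threshold` is marked OFF,
    provided the remaining active cells have enough spare capacity
    to absorb its traffic. Returns a list of booleans (True = ON).
    """
    states, shed = [], 0.0
    for load, cap in zip(estimated_load, capacity):
        if load / cap < threshold:
            states.append(False)   # candidate for sleep mode
            shed += load           # traffic that must be re-homed
        else:
            states.append(True)
    # Headroom of the cells that stay ON
    spare = sum(c - l
                for (l, c), on in zip(zip(estimated_load, capacity), states)
                if on)
    if spare < shed:               # not enough headroom: keep all ON
        states = [True] * len(states)
    return states

# Example: the middle cell carries most traffic, the others sleep.
print(switch_enbs([1.0, 10.0, 2.0], [20.0, 20.0, 20.0]))
# → [False, True, False]
```

A real scheme would also account for coverage holes and the wake-up delay of a sleeping eNB; this sketch only captures the load-threshold idea.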
Several power allocation schemes can improve system performance by achieving an optimal trade-off between interference and the signal-to-interference-plus-noise ratio (SINR). Because of its good performance and its efficacy in improving the mutual information between a channel's input and output, enhanced low-complexity water-filling (ELCWF), with its increment and decrement properties, has been widely employed for power allocation. This research studies the statistical behaviour of the Received Signal Strength Indicator (RSSI), Reference Signal Received Power (RSRP), Reference Signal Received Quality (RSRQ), and SINR, and also examines the statistical cost-versus-coverage performance. Under the 7th simulation run, the proposed method yields the best coverage, 88.81% and 98.74%, respectively. In the experimental scenario with 20 BS candidate locations, the deployment costs of the proposed model for the BS and RS are 27 and 13 units, the average throughput per user is 9.74 Mbps, and the coverage ratio is 97%. In the power allocation stage with 50 users, the proposed model yields a power consumption of 0.2 W, a throughput of 23.4 Mbps, a delay of 15 ms, and an SINR of 20.12 dB.
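The abstract does not detail ELCWF itself, but it builds on the classical water-filling principle: pour a total power budget over parallel channels so that stronger channels receive more power, up to a common "water level". A minimal sketch of standard water-filling (not the paper's enhanced variant) with a bisection search for the water level:

```python
import numpy as np

def water_filling(gains, total_power, noise=1.0):
    """Classical water-filling power allocation over parallel channels.

    gains: channel power gains g_i; noise: noise power per channel.
    Allocates p_i = max(0, mu - noise/g_i), with the water level mu
    found by bisection so that sum(p_i) == total_power.
    """
    inv = noise / np.asarray(gains, dtype=float)  # "floor" of each channel
    lo, hi = 0.0, inv.max() + total_power         # bracket for the water level
    for _ in range(100):                          # bisection on mu
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - inv, 0.0).sum() > total_power:
            hi = mu
        else:
            lo = mu
    mu = 0.5 * (lo + hi)
    return np.maximum(mu - inv, 0.0)

# Example: the weakest channel falls below the water level and gets no power.
p = water_filling([1.0, 0.5, 0.1], total_power=1.0)
```

ELCWF's increment/decrement properties mentioned in the abstract would refine how the allocation is adjusted iteratively; the fixed bisection above is only the baseline idea.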