International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 138 - Number 9
Year of Publication: 2016
Authors: Mohana H.K., Mohankumar N.M., Suhas K.R., Devaraju J.T.
DOI: 10.5120/ijca2016908970
Mohana H.K., Mohankumar N.M., Suhas K.R., Devaraju J.T. Effect of Noise Factor on System Performance in LTE Networks. International Journal of Computer Applications. 138, 9 (March 2016), 34-37. DOI=10.5120/ijca2016908970
The 3GPP Long Term Evolution (LTE) technology provides higher system throughput than earlier Broadband Wireless Access (BWA) telecommunication systems in order to meet the escalating demand for multimedia services. In such systems, a higher noise factor at the base station (eNB) degrades the system throughput, since an increase in noise factor at the eNB decreases the Signal to Noise Ratio (SNR) of the received signal. Thus, a network deployment with a lower noise factor at the eNB supports higher system throughput, which is essential for providing high Quality of Service (QoS) in LTE networks. Hence, in this paper an attempt has been made to study and evaluate the effect of various noise factors at the eNB on system performance in an uplink LTE network using the QualNet 7.1 network simulator. The performance metrics considered for the simulation studies are spectral efficiency, system throughput, total number of data bytes received, total number of transport blocks received with errors, delay and jitter.
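The relationship between the receiver's noise factor and the resulting SNR can be sketched with the standard link-budget identity: the thermal noise floor is -174 dBm/Hz at room temperature, scaled by the channel bandwidth, and the receiver's noise figure (the noise factor in dB) adds directly to that floor. The received power and bandwidth values below are illustrative assumptions, not figures from the paper:

```python
import math

def snr_db(p_rx_dbm, bandwidth_hz, noise_figure_db):
    """SNR at the receiver (e.g. the eNB) for a given noise figure.

    Thermal noise floor: -174 dBm/Hz at ~290 K, scaled by bandwidth;
    the receiver's noise figure adds directly to this floor in dB.
    """
    noise_floor_dbm = -174 + 10 * math.log10(bandwidth_hz) + noise_figure_db
    return p_rx_dbm - noise_floor_dbm

# Hypothetical values: -90 dBm received power over a 10 MHz LTE channel.
# Every 1 dB increase in noise figure costs exactly 1 dB of SNR.
for nf in (5.0, 7.0, 9.0):
    print(f"NF = {nf} dB -> SNR = {snr_db(-90.0, 10e6, nf):.1f} dB")
```

This makes the paper's premise concrete: raising the eNB noise factor lowers SNR one-for-one in dB, which in turn reduces the achievable modulation and coding rate and hence the system throughput.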