Overhead Impacts on Long-Term Evolution Radio Networks

University essay from KTH/Kommunikationssystem, CoS

Abstract: As a result of constant efforts to improve mobile system performance and spectral efficiency, the 3GPP standardization forum is currently defining new architectural and functional requirements intended to ensure the long-term evolution (specifically, the “Long-Term Evolution (LTE)” concept) and the general future competitiveness of the 2G and 3G radio access technologies. Previous discussions of LTE efficiency have focused on general assumptions about signaling overhead and overall system capacity, based on experience from existing mobile systems. However, as the 3GPP standardization has matured (although it is not yet settled), there is a need to investigate how different potential LTE services will be affected by the use of available overhead information and basic scheduling algorithms. This thesis investigates the overhead impact of the lower protocol layers on the downlink for different packet-switched services in an LTE radio access network (RAN). Results show that the use of RTP/TCP/IP header compression (ROHC) is the single most important factor in reducing payload overhead for packet sizes of ~1 kB or smaller; for packets larger than ~1 kB, the effect of ROHC becomes insignificant. Protocol headers (including the AMR frame header, the RLC/MAC headers, and the CRC where applicable) remain the largest part of the payload overhead regardless of packet size and header compression (ROHC). For VoIP over UDP (with ROHC), the RLC/MAC headers constitute the largest part of the protocol headers; for TCP/IP applications (without ROHC), the TCP/IP headers are predominant. Services that require packet sizes beyond ~1 kB need about the same power per payload bit regardless of the percentage of payload overhead.
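
To make the kind of overhead accounting described in the abstract concrete, the short sketch below tallies per-packet header bytes for a VoIP-style payload with and without ROHC and reports the resulting overhead share. The specific header sizes (AMR speech frame, RTP/UDP/IPv4 headers, compressed ROHC header, RLC/MAC headers, CRC) are typical illustrative values chosen for this example, not figures taken from the thesis.

```python
# Illustrative per-packet overhead accounting for a downlink VoIP packet.
# All sizes are assumed, typical values (bytes), NOT results from the thesis;
# adjust them to match the actual protocol configuration being studied.

AMR_PAYLOAD = 32             # AMR 12.2 kbps speech frame, approximate
RTP_UDP_IPV4 = 12 + 8 + 20   # uncompressed RTP/UDP/IPv4 headers
ROHC_HEADER = 3              # typical compressed header size with ROHC
RLC_MAC = 4                  # assumed combined RLC/MAC header overhead
CRC = 3                      # 24-bit CRC per transport block

def overhead_share(payload: int, headers: int) -> float:
    """Fraction of transmitted bytes that are not user payload."""
    return headers / (payload + headers)

for label, upper_headers in [("without ROHC", RTP_UDP_IPV4),
                             ("with ROHC", ROHC_HEADER)]:
    headers = upper_headers + RLC_MAC + CRC
    share = overhead_share(AMR_PAYLOAD, headers)
    print(f"VoIP {label}: {headers} header bytes, "
          f"{share:.0%} of transmitted bytes are overhead")
```

Running this with the assumed values shows why header compression dominates for small packets (the uncompressed RTP/UDP/IP headers exceed the speech payload itself), while for payloads of ~1 kB or more the same fixed header bytes become a negligible fraction of the transmitted data.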
