Remember when the 3G cell phone was a big deal? It might not have been that long ago, but the days of 3G being the fastest network are already behind us.
In December 2009, Scandinavian telecom provider TeliaSonera deployed the world’s first commercial 3GPP Long-Term Evolution (LTE) networks, in Oslo and Stockholm. Technically speaking, LTE is considered a transitional 3.9G technology that evolved from today’s CDMA-based 3.5G and 3.75G cellular standards, including 3GPP High Speed Packet Access (HSPA) and the 3GPP2 Evolution Data-Optimized (EV-DO) Rev. B standard.
Designed to meet the challenge of faster data rates on cellular networks, the LTE physical layer (PHY) employs several high-throughput digital communications technologies, including orthogonal frequency-division multiplexing (OFDM) and multiple-input multiple-output (MIMO) antenna techniques. With bandwidths of up to 20 MHz, higher-order modulation schemes such as 64-state quadrature amplitude modulation (64QAM), and up to 4x4 MIMO, the LTE downlink channel is capable of peak download speeds of up to 300 Mbits/s.
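That 300-Mbit/s figure can be sanity-checked with back-of-envelope arithmetic from the standard Release 8 downlink numerology. The sketch below uses the nominal 20-MHz parameters (100 resource blocks of 12 subcarriers, 14 OFDM symbols per 1-ms subframe); the gap between the raw symbol rate and the quoted peak is the overhead of reference signals, control channels, and channel coding.

```python
# Back-of-envelope peak rate for the 20-MHz LTE downlink (Release 8).
# Illustrative sketch: the usable peak depends on coding rate and
# control/reference-signal overhead, which this ignores.

SUBCARRIERS_20MHZ = 100 * 12      # 100 resource blocks x 12 subcarriers
OFDM_SYMBOLS_PER_MS = 14          # normal cyclic prefix, 1-ms subframe
BITS_PER_64QAM_SYMBOL = 6
MIMO_LAYERS = 4                   # 4x4 MIMO spatial multiplexing

raw_bits_per_ms = (SUBCARRIERS_20MHZ * OFDM_SYMBOLS_PER_MS
                   * BITS_PER_64QAM_SYMBOL * MIMO_LAYERS)
raw_rate_mbps = raw_bits_per_ms / 1e3   # bits per ms -> Mbit/s

print(f"Raw PHY symbol rate: {raw_rate_mbps:.1f} Mbit/s")  # 403.2 Mbit/s
# Overhead and coding bring the usable peak down to roughly the
# 300 Mbits/s quoted above.
```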
However, LTE isn’t the end of the road. By 2013, 3GPP’s LTE-Advanced may actually be the industry’s first truly 4G cellular standard. LTE-Advanced adds an enhanced feature set to the original LTE standard, and each of these features is designed for higher data throughput and better coverage.
For example, LTE-Advanced adds richer MIMO support with up to 8x8 MIMO downlink streams. In addition, LTE-Advanced eNBs (pronounced “e-node-Bs”) can use direction-finding algorithms to measure the angle to a mobile station and then apply beamforming to steer the downlink signal, delivering maximum power at that location. Finally, the carrier aggregation feature of LTE-Advanced also enables higher data throughput by allowing a data link to use multiple frequency carriers.
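The beam-steering idea can be illustrated with a few lines of code. The sketch below, for a hypothetical uniform linear array with half-wavelength spacing (real eNB antenna geometries vary), computes per-element phase weights that point the beam at a given angle and shows how the array gain falls off away from that angle.

```python
# Sketch of downlink beam steering for a uniform linear array (ULA).
# Hypothetical array parameters, for illustration only.
import cmath
import math

def steering_weights(n_antennas, spacing_wavelengths, angle_deg):
    """Per-element phase weights that align the array toward angle_deg
    (measured from broadside)."""
    theta = math.radians(angle_deg)
    return [cmath.exp(-2j * math.pi * spacing_wavelengths * n * math.sin(theta))
            for n in range(n_antennas)]

def array_gain(weights, spacing_wavelengths, angle_deg):
    """Normalized power gain of the weighted array toward angle_deg."""
    theta = math.radians(angle_deg)
    response = sum(
        w.conjugate()
        * cmath.exp(-2j * math.pi * spacing_wavelengths * n * math.sin(theta))
        for n, w in enumerate(weights))
    return abs(response) ** 2 / len(weights) ** 2

# 8 antennas at half-wavelength spacing, user at 20 degrees off broadside.
w = steering_weights(8, 0.5, angle_deg=20)
print(array_gain(w, 0.5, 20))    # ~1.0: full gain toward the user
print(array_gain(w, 0.5, -40))   # far lower gain off-beam
```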
Designing and implementing LTE-Advanced is highly challenging. Its use of carrier aggregation requires more efficient power amplifiers, for instance, necessitating advanced linearization algorithms like digital pre-distortion and envelope tracking. Furthermore, advanced 8x8 MIMO techniques with multi-layer beamforming require highly complex signal detection and decoding algorithms. These design challenges make early and rapid prototyping and validation of complex algorithms necessary so sound architectural design decisions are made early in the development cycle, avoiding costly design mistakes down the road.
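To make the pre-distortion idea concrete, here is a minimal memoryless sketch: a toy cubic model stands in for the amplifier’s gain compression, and the pre-distorter applies an approximate inverse so the cascade is nearly linear. The coefficients are hypothetical; production DPD uses polynomial models with memory, fit from measured amplifier data.

```python
# Minimal illustration of digital pre-distortion (DPD) using a
# memoryless cubic amplifier model. Coefficients are hypothetical.

def pa_model(x, a1=1.0, a3=-0.1):
    """Toy power amplifier: linear gain plus cubic compression."""
    return a1 * x + a3 * x ** 3

def predistort(x, a1=1.0, a3=-0.1):
    """First-order inverse of pa_model: pre-expand the input so the
    cascaded pre-distorter + amplifier response is nearly linear."""
    return x - (a3 / a1) * x ** 3

x = 0.5
print(pa_model(x))               # compressed output: 0.4875
print(pa_model(predistort(x)))   # much closer to the ideal 0.5
```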
The National Instruments graphical system design approach is a rapid prototyping platform to help engineers design LTE-Advanced systems, enabling integration of abstracted signal processing code with modular front-end hardware. At NIWeek 2011, we showcased the world’s first public prototype implementation of an 8x8 MIMO LTE-Advanced PHY on a commercial off-the-shelf platform, achieving a data rate of close to 1 Gbit/s.
The hardware platform comprises a high-performance quad-core Intel i7 multicore processor, NI FlexRIO FPGA processing modules with Xilinx Virtex-5 FPGAs, NI FlexRIO adapter modules for baseband digital-to-analog and analog-to-digital conversion, and PXI RF modulators and converters for over-the-air transmission (see the figure).
This high-performance platform, coupled with LabVIEW system design software and the LabVIEW FPGA Module, enabled a small team of communications and signal processing algorithm engineers to prototype a complex LTE-Advanced communication system in six man-months. This project is part of advanced research National Instruments is doing in communications system design.
As part of the program, NI announced a brand-new LabVIEW add-on called the LabVIEW DSP Design Module. This next-generation design tool helps DSP and communication system designers implement complex, real-time DSP algorithms by abstracting many low-level FPGA details.
The new LabVIEW module will be available for early-access users starting in early 2012. See this demo in action at www.youtube.com/watch?v=wklxfXGQ_7s.