Electronic Design
Wireless Companies Follow The Roadmap Past 4G And On to 5G

It never ends. Once one generation of wireless standards is ratified, work on the next generation begins. This continuing process has led to significant gains in data rate and link reliability, pushing the bounds of wireless communications. And apparently there is more to be done.

We have barely started the fourth generation (4G), and already work has begun on even more advanced standards that may eventually be called 5G, though no one is ready to admit it. Still, lots of interesting standards work is ongoing in the IEEE, the International Telecommunication Union (ITU), the 3rd Generation Partnership Project (3GPP), and other organizations.

What’s 4G, Anyway?

The “generations” reference is related to the different stages of cell-phone systems development. The first generation beginning in the early 1980s was analog FM radio. The second generation saw the development of the first digital cell-phone standards that helped expand subscriber capacity in the same channels. The most important 2G standards that emerged were the Global System for Mobile Communications (GSM) and the first CDMA systems (Fig. 1).

The third generation (3G) brought more CDMA developments like UMTS/WCDMA to boost data rates to as much as 2 Mbits/s. Some variants of 3G provide even greater data rates. These include the more recent 3G additions such as EV-DO Rev. A and B as well as the HSDPA/HSUPA/HSPA+ standards, where data rates extend to as much as 21 and 42 Mbits/s.

Today, we’re just entering the 4G realm. Most companies seem to be saying that 4G refers to Long-Term Evolution (LTE), the ITU/3GPP standard generally accepted throughout the world as the next big technology advance in the cellular networks. Some companies also consider WiMAX to be a 4G technology.

The ITU and 3GPP, which usually have a say in these matters, generally regard LTE and WiMAX as 3G technologies. However, the marketing organizations for major wireless carriers call LTE and WiMAX 4G. Even HSPA+, a 3G technology designation, has been called 4G by T-Mobile and AT&T. In any case, HSPA, LTE, and WiMAX are certainly major steps beyond the conventional UMTS/WCDMA and cdma2000 EV-DO standards that we normally see as 3G.

Last October, the ITU stepped in and declared that LTE, EV-DO Rev. A and B, HSPA, and WiMAX are still 3G. The ITU says that 4G is what it calls IMT-Advanced. Of the various standards being considered for real 4G, only two have met the requirements for IMT-Advanced: LTE-Advanced and WiMAX2, also known as WirelessMAN-Advanced or IEEE 802.16m. While these standards have been ratified, they have yet to be implemented, and it will be years before we see them.

Are AT&T, Clearwire, Sprint, T-Mobile, and Verizon falsely advertising their latest technology upgrades as 4G? Maybe. But WiMAX, HSPA, and standard LTE are indeed different as well as a technological step beyond previous WCDMA and cdma2000 EV-DO 3G systems. You could argue that they are 4G. Perhaps the ITU needs to proclaim LTE-Advanced and WiMAX2 as 5G. Otherwise, what is 5G?

Figure 1 shows the evolution of the various standards and how most systems are converging on LTE and WiMAX as 4G. The orthogonal frequency-division multiplexing (OFDM) physical layer (PHY) dominates. Note also the progression of the IEEE 802.11 wireless local-area network (LAN) standards. While they aren’t a formal part of the cellular generation designation, these standards have generally followed their progress with increasing data rates and other features (see “The Future Of Wi-Fi, UWB, And The Less Known Wireless Technologies,” p. xx).

A Look At LTE

Since the cellular industry is mostly focused on deploying LTE systems, a review of its features and capabilities is in order. First, LTE is based on orthogonal frequency-division multiple access (OFDMA). It uses OFDM for the modulation and OFDMA for the access of multiple subscribers in a single channel. LTE also is flexible, as it can be configured to operate in different channel bandwidths. Common channel sizes are 1.4, 3, 5, 10, 15, and 20 MHz.

The number of OFDM subcarriers varies as a function of the bandwidth. Subcarrier spacing is 15 kHz. The subcarriers are modulated with quadrature phase-shift keying (QPSK), 16-state quadrature amplitude modulation (16QAM), or 64-state QAM (64QAM). The primary duplexing mode is frequency-division duplex (FDD), which requires two equal but spaced channels. The standard also defines a time-division duplex (TDD) mode.
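The bandwidth scaling above can be sketched in a few lines of Python. This is an illustration rather than anything quoted from the spec: the resource-block counts are the standard 3GPP values for each channel width, and every resource block spans 12 of the 15-kHz subcarriers.

```python
# Sketch: LTE downlink subcarrier count per channel bandwidth.
# Resource-block counts are the standard 3GPP values; each resource
# block spans 12 subcarriers of 15 kHz.
RESOURCE_BLOCKS = {1.4: 6, 3: 15, 5: 25, 10: 50, 15: 75, 20: 100}

def occupied_bandwidth_mhz(channel_mhz):
    """Return (subcarrier count, occupied bandwidth in MHz) for a channel."""
    subcarriers = RESOURCE_BLOCKS[channel_mhz] * 12
    return subcarriers, subcarriers * 15e3 / 1e6

for bw in sorted(RESOURCE_BLOCKS):
    sc, occ = occupied_bandwidth_mhz(bw)
    print(f"{bw:>4} MHz channel: {sc:>4} subcarriers, {occ:.2f} MHz occupied")
```

Running it shows, for example, that a 20-MHz channel carries 1,200 subcarriers occupying 18 MHz, with the remainder left as guard band.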

LTE has adopted multiple-input multiple-output (MIMO) antenna technology to boost data rates and improve link reliability. It also supports the single-input single-output (SISO) antenna configuration. MIMO configurations include 2x2 (two transmit, two receive paths) and 4x4. Most handsets will use a 1x2 (one transmit, two receive channels) arrangement because of the limited space for antennas and power requirements.

The uplink for LTE uses single-carrier frequency-division multiple access (SC-FDMA). This technology is similar to OFDMA but has a lower peak-to-average power ratio (PAPR) than OFDMA for greater power efficiency and improved battery life, which are essential in a handset.
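A toy simulation illustrates why PAPR matters. This sketch uses hypothetical parameters (64 subcarriers, random QPSK symbols) and pure Python for portability; it compares an OFDM waveform, where many subcarriers can add in phase and create large peaks, against unfiltered single-carrier QPSK, whose envelope is constant.

```python
import cmath
import math
import random

random.seed(1)
N = 64  # subcarriers in this toy example

# Random QPSK symbols, all with the same magnitude.
X = [random.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) for _ in range(N)]

def papr_db(samples):
    """Peak-to-average power ratio in dB."""
    powers = [abs(s) ** 2 for s in samples]
    return 10 * math.log10(max(powers) / (sum(powers) / len(powers)))

# OFDM time-domain signal: inverse DFT of the symbol vector, so the
# subcarriers can align in phase and produce occasional large peaks.
ofdm = [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
        for n in range(N)]

print(f"OFDM PAPR: {papr_db(ofdm):.1f} dB")         # several dB
print(f"Single-carrier PAPR: {papr_db(X):.1f} dB")  # 0 dB, constant envelope
```

The gap of several dB is the reason the handset side uses the single-carrier variant: the power amplifier can run closer to saturation, where it is most efficient.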

As for speed performance, LTE brings a considerable upgrade to the cellular network over even the latest version of HSPA+. Using 64QAM in a SISO configuration, the peak data rate is 100 Mbits/s in a 20-MHz channel. That rate drops to 57.6 Mbits/s with 16QAM and 50 Mbits/s with QPSK. The data rate jumps to a peak of 172.8 Mbits/s using 2x2 MIMO. Although not widely supported, 4x4 MIMO is expected to deliver a peak of 326.4 Mbits/s.
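The 20-MHz SISO figure is easy to sanity-check with back-of-envelope arithmetic. This sketch ignores cyclic-prefix details and control/reference-signal overhead, so it lands slightly above the quoted 100 Mbits/s:

```python
# Back-of-envelope LTE downlink peak rate for a 20-MHz channel.
subcarriers = 1200          # 100 resource blocks x 12 subcarriers
bits_per_symbol = 6         # 64QAM
symbols_per_subframe = 14   # normal cyclic prefix, 1-ms subframe

raw_bits_per_s = subcarriers * bits_per_symbol * symbols_per_subframe * 1000
print(f"Raw SISO peak: {raw_bits_per_s / 1e6:.1f} Mbits/s")  # 100.8 Mbits/s

# Each added MIMO spatial stream scales this raw upper bound; real peak
# rates come in lower once overhead is subtracted.
```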

LTE is currently deployed on a relatively small scale worldwide. Most carriers have committed to LTE, but implementation is slow due to the high cost of new infrastructure equipment. The trend toward adoption is clear even among carriers that previously followed the cdma2000 EV-DO route. In fact, two cdma2000 carriers, MetroPCS and Verizon, were the first to implement LTE in the U.S. MetroPCS has LTE service in all of its major metropolitan coverage areas, using Samsung LTE phones.

Verizon Wireless launched its LTE network in December last year in 39 market areas. Since then, the company has continuously rolled out new LTE coverage. Its plans call for coverage in as many as 175 markets by the end of 2011. Verizon uses Samsung handsets as well as the newer HTC Thunderbolt (Fig. 2). Users are experiencing downlink data rates of 5 to 12 Mbits/s and 2 to 5 Mbits/s on the uplink, a major improvement over older 3G systems.

U.S. Cellular is also expected to launch LTE in 24 markets by November this year. AT&T plans to begin limited LTE service later in 2011 with greater deployment in 2012. In the meantime, it continues to upgrade its 3G HSPA+ networks and backhaul. Sprint Nextel and Clearwire have already implemented their “4G” network with WiMAX, and that expansion is expected to continue. These carriers may even switch over to LTE.

LTE is expected to gain increasing momentum through the second quarter of 2011. Currently, 12 countries have commercial LTE services. ABI Research projects that by the end of the year, there will be some 16 million subscribers using LTE mobile devices. Now all we need is a good supply of LTE smart phones.

LTE-Advanced

LTE-Advanced is what will become the formal ITU-blessed version of 4G. As an evolved version of LTE, it boosts data rates to 1 Gbit/s by using wider-bandwidth channels and higher-level MIMO schemes. Data rates beyond the theoretical peak of 326.4 Mbits/s are achieved by using bandwidths of 40 to 100 MHz.

Ideally, the channel would be one contiguous segment. But if not, the standard allows non-contiguous segments to be aggregated. Given the spectrum limitations most carriers live with, non-contiguous aggregation is likely to be the norm. Figure 3 shows four non-contiguous 20-MHz LTE signals using Agilent Technologies’ new LTE-Advanced Signal Studio signal generation software.

LTE-Advanced also defines a maximum of 8x8 MIMO, but it isn’t likely to be widely supported. However, various other MIMO configurations will be more common such as 2x2, 4x4, and 4x2. Handset MIMO is more likely to be 1x2, but up to 4x4 could be used if the antennas can be de-correlated. Separation between antennas is the solution, but that’s difficult in something as small as a handset. Work continues in this area. This standard also is expected to cover:

• Coordinated multipoint (CoMP): This is an arrangement where transmitters do not have to be co-located and can be linked by a high-speed connection of some sort.

• Relaying: Relay stations between the end user and basestation will retransmit downlink and uplink signals to improve coverage. It will also increase data rates and help eliminate dead zones in coverage as well as extend coverage in rural areas.

• Femtocells: These home-based cell sites along with enterprise picocells should further increase coverage areas at lower cost.

LTE-Advanced is not finalized yet, and it will be years before we see it. It will ultimately become one of the technologies along with WiMAX2 to form IMT-Advanced, which is the ITU’s final say on 4G.

WiMAX Remains A Player

As good as it is, WiMAX always seems to be treated as the black sheep of the broadband wireless flock. Yet it appears to be just as good as LTE. Both are OFDM-based with minimal differences. WiMAX, which stands for Worldwide Interoperability for Microwave Access, is a wireless metropolitan-area network (MAN) system designed for broadband access.

The standard was originally intended to be a wireless alternative to DSL or cable TV Internet connections, but it has become more than that. The IEEE originally standardized it as 802.16 or fixed WiMAX in 2004. A mobile version designated 802.16e was standardized in 2005. The ITU has designated all of these versions as 3G, but the newer 802.16m is a candidate for IMT-Advanced, making it 4G WiMAX.

WiMAX has been deployed for Internet access in more than 580 installations in 150 countries including the U.S. It uses the 2.3-, 2.5-, 3.3-, 3.5-, and 5-GHz bands, with 2.5 and 3.5 GHz being the most common. Clearwire and Sprint Nextel use WiMAX in the 2.5-GHz band in the U.S. for wireless Internet access on laptops and cell phones. There’s no doubt WiMAX will ultimately be used in the newer 700-MHz bands. Besides its primary use in broadband Internet access, it is also used in microwave backhaul links for cellular basestations and Wi-Fi hotspots.

The latest and most widely used version of WiMAX, Mobile WiMAX 802.16e, uses a scalable OFDM and OFDMA for access where the number of subcarriers can be 128, 512, 1024, or 2048 depending on the channel bandwidth, which may be 1.25, 5, 10, or 20 MHz. The basic subcarrier spacing is 10.94 kHz. The standard supports antenna diversity, adaptive antennas, and MIMO for improved link reliability and higher data rates.
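The "scalable" part can be sketched in a short Python illustration: the FFT size grows with the channel bandwidth so the subcarrier spacing stays fixed near 10.94 kHz. The 28/25 sampling factor used here is the standard value for these bandwidths, but the pairing of bandwidths to FFT sizes is shown for illustration only.

```python
# Sketch of Mobile WiMAX scalable OFDMA: the FFT size scales with
# channel bandwidth so the subcarrier spacing stays constant.
SAMPLING_FACTOR = 28 / 25  # standard oversampling factor for these bandwidths

PROFILES = {1.25e6: 128, 5e6: 512, 10e6: 1024, 20e6: 2048}  # Hz -> FFT size

def subcarrier_spacing_hz(bandwidth_hz):
    """Sampling rate = bandwidth x 28/25; spacing = sampling rate / FFT size."""
    return bandwidth_hz * SAMPLING_FACTOR / PROFILES[bandwidth_hz]

for bw in sorted(PROFILES):
    print(f"{bw / 1e6:>5.2f} MHz, FFT {PROFILES[bw]:>4}: "
          f"{subcarrier_spacing_hz(bw) / 1e3:.2f} kHz spacing")
```

Every profile works out to the same 10.94-kHz spacing, which is what lets one silicon design serve all of the channel widths.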

Turbo coding and low-density parity check (LDPC) forward error correction schemes are used. Access is primarily TDD in mobile applications, though an FDD profile is defined. Modulation adapts to link conditions: binary phase-shift keying (BPSK) handles the poorest conditions, while QPSK and 16QAM boost the data rate under better path conditions.

Range and data rate both vary widely depending upon the application. In a fixed-station application, the range can be as great as 30 miles. In mobile applications, stations or cell sites mainly cover a 3- to 10-mile radius. Most mobile cell sites have a range of only 1.5 miles to provide reliable connections at a usable data rate. Data rates for individual users in most systems range from 1 to 5 Mbits/s. But in a single-user downlink, a data rate of 128 Mbits/s in a 20-MHz channel with a 56-Mbit/s uplink speed can be achieved.

The IMT-Advanced 802.16m version, also known as WiMAX2, is even more broadly defined, with frequency coverage down to 450 MHz and through the 5-GHz range. It can achieve 1-Gbit/s speeds in a fixed environment and 100 Mbits/s in a mobile environment. A minimum of 2x2 MIMO is defined for basestations, but they will no doubt use 4x4 or higher plus beamforming as well. A mobile user will have a minimum of one transmit and two receive signal chains (1x2 MIMO). Modulation is 64QAM in a 20-MHz channel. This is true 4G.

Millimeter Waves And Beyond

As we run out of spectrum space, the pace of wireless progress could slow. But as we have always done, we will move to the higher frequencies where there is more space. It’s already a trend as more and more wireless is pushed beyond microwaves into the millimeter bands from 30 to 300 GHz. With huge spectrum slots available, the potential for common data rates well above 1 Gbit/s is solid. Uncompressed high-definition video can be easily transmitted with space to spare.

Ted Rappaport, a chaired professor at the Wireless Networking and Communications Group (WNCG) at the University of Texas in Austin, believes the future of most wireless is clearly in the millimeter-wave region. In his talk at the IEEE Globecom in Miami last year, he presented his case for the wireless future in this region. The bountiful bandwidth available opens the door to many new wireless applications.

Rappaport also believes the significant challenges at these higher frequencies are being overcome: the inherently short transmission distances, atmospheric absorption of the signal (especially by oxygen molecules at some frequencies), and the low output power of existing semiconductor devices. He believes most of these issues can be addressed with high-gain antennas.

A major benefit of these high frequencies is the very small antenna size (a half wavelength at 60 GHz is 2.5 mm), which permits many antenna elements to be formed on a common structure to create phased arrays with very high gain to boost transmit and receive power. Phased arrays also offer adaptive beamforming, which permits working around obstacles in the line-of-sight path these frequencies normally require.
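Both numbers are easy to verify. A short sketch, where the 10*log10(N) array-gain figure is the idealized upper bound for an N-element array rather than a measured value:

```python
import math

C = 299_792_458  # speed of light, m/s

def half_wavelength_mm(freq_ghz):
    """Half wavelength in millimeters at the given frequency."""
    return C / (freq_ghz * 1e9) / 2 * 1000

# ~2.5 mm at 60 GHz, matching the element spacing cited above.
print(f"Half wavelength at 60 GHz: {half_wavelength_mm(60):.2f} mm")

# Ideal array gain grows with element count as ~10*log10(N) dB.
for n in (4, 16, 32):
    print(f"{n}-element array: ~{10 * math.log10(n):.1f} dB of ideal array gain")
```

A 32-element array, the size used in some 60-GHz chipsets, thus buys roughly 15 dB over a single element, which goes a long way toward offsetting the band's high path loss.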

Such antennas can be formed on printed-circuit boards (PCBs), small substrates, or even large silicon chips. For large equipment, the legacy horn and parabolic antennas offer exceptional high gain and narrow beamwidths. Further range extensions can be achieved with relay repeater points and mesh networks.

Some applications already use the millimeter-wave bands. Microwave point-to-point links are common for interconnecting networks at remote sites. Cellular and other network backhaul use 60 and 80 GHz (Fig. 4). Automobile radar uses the 77-GHz band, and numerous other military and satellite systems also use this range. Unmanned aerial vehicle (UAV) landing systems use a 35-GHz link. Yet the real opportunity lies in more commercial and consumer applications. Chipsets for millimeter wave applications are now becoming more common (Fig. 5).

The IEEE 802.11ad standard uses the 60-GHz band and targets consumer and commercial applications. Faster wireless LAN (WLAN) connections and the wireless transmission of video in home entertainment systems are the initial uses. There are several proposed standards for the 60-GHz wireless space.

Based on the standard developed by the WiGig Alliance last year, IEEE 802.11ad divides the 60-GHz spectrum into four 2.16-GHz wide channels. The PHY may be either OFDM for best link reliability and speed or a single-carrier (SC) format for low power consumption. The SC mode can still achieve up to 4.6-Gbit/s throughput. The OFDM mode can deliver a maximum of 7 Gbits/s.
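The channelization can be sketched as follows; the channel-1 center frequency used here is the commonly cited value from the WiGig channel plan:

```python
# Sketch of the 802.11ad channelization: four 2.16-GHz-wide channels
# in the 60-GHz band, spaced one channel width apart.
CHANNEL_WIDTH_GHZ = 2.16
FIRST_CENTER_GHZ = 58.32  # channel 1 center per the WiGig plan

centers = [FIRST_CENTER_GHZ + (ch - 1) * CHANNEL_WIDTH_GHZ for ch in range(1, 5)]
for ch, center in enumerate(centers, start=1):
    low, high = center - CHANNEL_WIDTH_GHZ / 2, center + CHANNEL_WIDTH_GHZ / 2
    print(f"Channel {ch}: {low:.2f}-{high:.2f} GHz (center {center:.2f} GHz)")
```

Not every regulatory domain opens all four channels, which is one reason a compliant radio must be able to hop among them.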

The media access control (MAC) layer is compatible with previous 802.11 standards. This standard defines some protocol adaptation layers (PALs) that let other communications standards ride over the 60-GHz waves. These include DisplayPort and HDMI for uncompressed video and USB and PCI Express for I/O operations. The main target of this standard is video transmission.

Another 60-GHz wireless system called WirelessHD also focuses on consumer video connectivity for HDTV sets, set-top boxes, gaming consoles, HD video displays, DVDs, and DVRs, including those for 3D TV. It uses a 7-GHz-wide channel in the 60-GHz industrial, scientific, and medical (ISM) band that extends from 57 to 64 GHz.

WirelessHD permits data rates from a few gigabits per second to as high as 28 Gbits/s. It can easily transmit compressed or uncompressed video at the highest resolutions available. Its transport data rates accommodate 10.2-Gbit/s HDMI (High-Definition Multimedia Interface) 1.3 and 21.6-Gbit/s DisplayPort 1.2.

Also, WirelessHD uses OFDM with BPSK or QPSK to achieve a data rate up to 4 Gbits/s. Higher data rates are achieved with 16QAM or 64QAM. Support is also provided for Digital Transmission Content Protection (DTCP) encryption (AES128) and High-bandwidth Digital Content Protection (HDCP) for copy protection.

WirelessHD is already available in the form of SiBeam’s OmniLink60 chipset, which incorporates a 32-element steerable phased-array antenna on a separate substrate. The array’s gain provides up to 10 W of effective isotropic radiated power (EIRP), extending non-line-of-sight (NLOS) range to 10 meters.

The WirelessHD specification is now part of the IEEE’s 802.15.3c standard for short-range wireless devices. The 802.15.3c standard is part of the IEEE’s Wireless Personal Area Networking (WPAN) effort, which includes Bluetooth, ZigBee, and other short-range technologies.

HDMI pioneer Silicon Image recently acquired SiBeam, which developed WirelessHD. Silicon Image may establish a formal wireless standard of HDMI using WirelessHD, as reported by market research firm IHS iSuppli.

Overall, the millimeter-wave bands seem ideal for the future of wireless. There’s lots of bandwidth to achieve multigigabit data rates, and the spectrum around 60 GHz is available worldwide for unlicensed use. Short range is a downside, as 60-GHz radios can’t match the range of a typical Wi-Fi device, but that range isn’t always necessary.

Atmospheric absorption and multipath reflections are continuing problems, though they can be overcome to some extent with higher transmit power and high gain directional and steerable antennas. While CMOS has been used to build 60-GHz chipsets, it is still a difficult technology and power consumption is still high, limiting its use in battery-powered devices. As semiconductor processes improve and costs come down, we will eventually see this technology come into its own.

When There’s No Spectrum Left

Three words: free-space optics (FSO). When we run out of radio spectrum, we can still resort to the optical spectrum, including infrared (IR) light. There are no regulatory or licensing requirements either.

IR is already widely used in wireless transmissions. TV remote controls and some IR backhaul equipment are readily available. And remember IrDA, the short-range standard popular for a while in laptops and PDAs? It’s still a viable option for some applications.

FSO equipment has limited range affected mainly by rain, fog, dust, and other atmospheric effects. Using high-power laser transmitters in the 785- to 1550-nm range, data rates from 10 Mbits/s to well over 1.5 Gbits/s are possible at a range to almost 10 km (6.2 miles). Also, 10-Gbit/s Ethernet systems are available to reach several hundred meters.

FSO is ideal for short links between buildings to avoid cabling costs. It’s currently used to link or extend networks using Ethernet, Sonet/SDH, T1/E1, ATM, or other common standards.

So What Is 5G?

We will be in the 4G era for a long time yet. Carriers are still rolling out their so-called 4G (ITU 3G) systems. Once IMT-Advanced is finally ratified, it will take years and a huge investment before the first LTE-Advanced or WiMAX2 systems get into place. Only then will work begin on the next generation.

But if we get full 4G with all its capabilities including the streaming of HD video, will we need 5G? We may not, but you know we will get it anyway. According to Rambus director of logic design and verification Ely Tsern, 5G is probably more of an evolution of 4G with minor improvements in coverage and reliability as opposed to higher data rates.

With real 4G giving us 100 Mbits/s to 1 Gbit/s in the years to come, we may not need much more for most mobile applications. OFDM technology reaches its peak spectral efficiency (15 bits/s/Hz) with IMT-Advanced, and there isn’t much more to wring out of it.
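To see why 15 bits/s/Hz is near the practical ceiling, apply the Shannon capacity formula, C/B = log2(1 + SNR). The sketch below uses idealized AWGN assumptions, and the four-stream split is an illustrative example rather than a figure from the standard:

```python
import math

def shannon_snr_db(bits_per_s_per_hz):
    """Minimum SNR (dB) an ideal AWGN channel needs: C/B = log2(1 + SNR)."""
    return 10 * math.log10(2 ** bits_per_s_per_hz - 1)

# A single antenna stream at 15 bits/s/Hz would need roughly 45 dB of SNR,
# far beyond what a typical cellular link ever sees...
single_stream = shannon_snr_db(15)

# ...which is why the headline figure leans on spatial multiplexing:
# four MIMO streams at 3.75 bits/s/Hz each need a far more attainable SNR.
per_stream = shannon_snr_db(15 / 4)

print(f"Single stream: {single_stream:.1f} dB, per MIMO stream: {per_stream:.1f} dB")
```

The arithmetic makes the article's point: once you are this close to the Shannon bound, further gains must come from more antennas, more bandwidth, or more cells, not cleverer modulation.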

5G will improve coverage with more, smaller cell sites such as femtocells and picocells, plus upgraded backhaul, to ensure we get the most out of the whole system. Tsern also sees some fixed/mobile convergence, where wireless services like 4G can be integrated with Wi-Fi and perhaps other networks to provide even better coverage and more intelligent use of existing resources.

Based on current trends, it’s possible to predict what 5G may be like. First, given the ongoing spectrum shortage and crisis, higher-level modulation and coding schemes can help in the interim with higher speeds. Adaptive beamforming antenna techniques will also help access and speeds.

Wi-Fi will continue to serve as a more important offload mechanism to alleviate network congestion and increase speed, but it’s inevitable that systems will move to the higher frequencies. Millimeter-wave systems will emerge. Smaller, shorter-range basestations like picocells and femtocells will become commonplace. Mesh networking and repeaters will also play a role. A 5G standard is at least a decade away.
