
Communications Forecast: Top 10 Communications Trends to Watch in 2015

Dec. 23, 2014
Evolutionary leaps in the communications arena seem to be an annual event. “Softwarization” and a ramped-up IoT could be two of this year’s paradigm shifters.

Communications is the core of electronics. The technology traces back to the dawn of the field and still dominates today, weaving its way into nearly every facet of modern life. However, its immense impact on our lives is generally taken for granted. With that in mind, pay attention to these key communications trends in the coming year:

Continued Expansion of LTE

Long Term Evolution (LTE) 4G cellular standards, well established in the U.S. and Asia, haven’t caught on everywhere. Many U.S. locations and most developing nations still rely mainly on 3G technology. In Europe, for example, LTE penetration is only 14%. With the smartphone now the de facto standard handset, there’s ever-growing demand for broad and fast LTE coverage. In response, U.S. carriers continue to expand their LTE offerings. That growth, though, often slows due to lack of available capital and suitable frequency spectrum.


The next big step for LTE involves LTE-Advanced and small cells. LTE-A leverages carrier aggregation and higher MIMO levels to widen bandwidth and boost speed. Modem chips for LTE-Advanced (Release 10) are just now emerging, so look for LTE-A cell sites and handsets to arrive in 2016, most likely not this year. Then download speeds up to 300 Mb/s will be possible under ideal conditions.
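
For a rough sense of where that 300-Mb/s figure comes from, here’s a back-of-envelope calculation using typical LTE Category 6 assumptions (two aggregated 20-MHz carriers, 2×2 MIMO, 64-QAM). The 25% overhead factor is an illustrative assumption for control channels and reference signals, not a spec value:

```python
# Rough LTE-Advanced peak downlink estimate (Category 6 assumptions:
# two aggregated 20-MHz carriers, 2x2 MIMO, 64-QAM). Illustrative
# arithmetic only, not a link-budget calculation.

RESOURCE_BLOCKS = 100        # per 20-MHz LTE carrier
SUBCARRIERS_PER_RB = 12
SYMBOLS_PER_MS = 14          # normal cyclic prefix
BITS_PER_SYMBOL = 6          # 64-QAM
MIMO_LAYERS = 2
OVERHEAD = 0.25              # assumed control/reference-signal overhead

def peak_rate_mbps(carriers):
    symbols_per_sec = RESOURCE_BLOCKS * SUBCARRIERS_PER_RB * SYMBOLS_PER_MS * 1000
    raw = symbols_per_sec * BITS_PER_SYMBOL * MIMO_LAYERS * carriers
    return raw * (1 - OVERHEAD) / 1e6

print(f"Single 20-MHz carrier: ~{peak_rate_mbps(1):.0f} Mb/s")   # ~151 Mb/s
print(f"2x carrier aggregation: ~{peak_rate_mbps(2):.0f} Mb/s")  # ~302 Mb/s
```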

While carrier aggregation will boost bandwidth to increase speeds, the lack of spectrum will still limit LTE-A. It’s going through trials and testing now around the world with almost no commercial activity. Carriers like AT&T, T-Mobile, and Verizon are currently preparing for real service in the coming years as chips and other equipment become available.

Despite the capital and spectrum limitations, LTE will still grow significantly. ABI Research estimates that 676 million LTE handsets will ship in 2015, a 50% increase over 2014. Furthermore, ABI estimates 1.89 billion LTE-enabled devices will be in use by 2019. Thus, the burden gets placed on carriers to expedite the LTE infrastructure to support that quantity.

As the LTE infrastructure expands, some of that growth will entail small cells—miniature base stations with limited range and power.  These small cells will add to the existing cellular base-station mix to create a heterogeneous network (HetNet) that should boost coverage indoors and out, as well as increase downlink speeds. Few small cells have been installed, but look for gradual deployment in dense population areas. 

Defining 5G

Even with 4G technology still in expansion mode, fifth-generation (5G) cellular systems are already being defined.  We will linger a while longer in the 4G world as LTE continues to expand, LTE-A comes online, and small-cell efforts like Wi-Fi offload and distributed antenna systems (DAS) are implemented. Nonetheless, debates about 5G’s future are underway, with further definitions expected this year.

As usual, the goal is to expand capacity, fill in the coverage gaps (especially indoors), ease the spectrum shortage problem, and increase downlink speed. The emerging consensus is that small cells in the millimeter-wave bands can do the job.  While physics restricts the range of millimeter-wave signals, high-gain antennas and many small cells should make it workable.  The 28-, 38- and 73-GHz millimeter bands have been proposed. 
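
A quick free-space path-loss calculation shows just how much physics works against millimeter waves, and how many dB of antenna gain and cell densification must be recovered. The 200-m distance is simply an illustrative small-cell radius:

```python
import math

def fspl_db(freq_hz, dist_m):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)

# Compare today's cellular bands with the proposed millimeter-wave bands.
for f_ghz in (2, 28, 38, 73):
    print(f"{f_ghz:>2} GHz at 200 m: {fspl_db(f_ghz * 1e9, 200):.1f} dB")
# 2 GHz loses ~84.5 dB; 28 GHz ~107.4 dB; 73 GHz ~115.7 dB. Each decade
# of frequency costs 20 dB, which beamforming antenna gain must offset.
```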

While OFDM may be used, newer modulation methods could be part of the new standards. Steerable beamforming antenna arrays and high levels of MIMO will allow gigabit speeds in dense urban surroundings. Millimeter-wave backhaul will connect everything together. Overall, we’re years away (2020?) from 5G, but be on the lookout for ongoing discussions on its progress.

Eternal Ethernet

Ethernet, the ubiquitous local-area-network (LAN) technology, has been with us for over 40 years and continues to morph to keep pace with changing technology. Since its beginning, Ethernet has increased line rates by a factor of ten every few years. The original 10 Mb/s soon became 100 Mb/s, then 1 Gb/s, and then 10 Gb/s. Today, Ethernet delivers 100 Gb/s in copper as well as fiber forms.

Lately, though, a different path is taking shape in terms of speed. Instead of reaching for the next decade step of 1 terabit per second (1 Tb/s), IEEE 802.3 task forces are targeting 400 Gb/s as well as intermediate speeds. The idea is to adapt Ethernet to specific needs and niches. A great example is the proposal for 2.5G and 5G versions of Ethernet, an effort spurred on by the NBASE-T Alliance, an organization dedicated to promoting and developing those versions.

One projected problem is that the LAN infrastructure needs to support the forthcoming 802.11ac Wave 2 wireless hotspots capable of multi-gigabit speeds. With most access points stuck at 1 Gb/s, existing cable installations can’t handle the extra speeds achievable with the wireless access points. The most common CAT5 and CAT6 cabling (Fig. 1) installations don’t support the 10-Gb/s version of Ethernet, so the need arises for another solution. We can look forward to some new versions of Ethernet with modulation methods that can handle 2.5G and 5G speeds on standard unshielded twisted pair up to 100 meters.

1. Ethernet’s familiar CAT5/6 cables and connectors will continue to provide the links for new variations of this ubiquitous LAN standard (courtesy of the University of New Hampshire Interoperability Laboratory, UNH-IOL).
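
A sketch of the down-clocking idea behind the 2.5G/5G proposals, assuming (as was widely expected at the time) that they reuse 10GBASE-T signaling at one-quarter and one-half the symbol rate so that installed Cat5e/Cat6 can carry it to 100 meters. The 10GBASE-T figures are the commonly cited ones; the final 2.5G/5GBASE-T specs were still being defined:

```python
# Down-clocking sketch: reuse the 10GBASE-T PHY (4 wire pairs,
# 800 Msymbols/s per pair) at a fraction of the symbol rate.

PAIRS = 4
SYMBOL_RATE_10G = 800e6  # symbols/s per pair in 10GBASE-T
BITS_PER_SYMBOL = 10e9 / (PAIRS * SYMBOL_RATE_10G)  # ~3.125 info bits/symbol

for divisor, name in ((4, "2.5GBASE-T"), (2, "5GBASE-T"), (1, "10GBASE-T")):
    sym_rate = SYMBOL_RATE_10G / divisor
    rate = PAIRS * sym_rate * BITS_PER_SYMBOL
    print(f"{name}: {rate / 1e9:.1f} Gb/s at {sym_rate / 1e6:.0f} Msym/s per pair")
```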

Another similar effort now underway comes via the 25G/50G Ethernet Consortium. This group wants new versions of Ethernet that will run at 25G and/or 50G on copper cables and backplanes. The impetus is to provide lower-cost interconnections of servers and storage units in data centers to support growing cloud, video-data and wireless-traffic needs.

We needn’t worry about Ethernet. Forthcoming new versions for 2.5G, 5G, 25G, 50G and 400G will keep us happy in the years to come.

New Short-Range Wireless Options

Bluetooth is now more than 15 years old and ZigBee more than 10, and each has carved out a niche in the short-range wireless market. Bluetooth became successful with its wireless headsets, hands-free automobile kits, and wireless speakers. Basically, it’s in every smartphone. ZigBee made great strides with home automation, industrial mesh sensor networking, and remote controls. They rarely competed with each other. That could all change, though, given the new versions of these technologies aimed at the Internet of Things (IoT) market.

The Bluetooth SIG’s latest version, 4.2, improves privacy and security, boosts data speeds, and adds Internet connectivity. The improved privacy features keep users of beacons from being tracked. Data speed jumps by a factor of 2.5 over version 4.1, improving capacity. In addition, Internet connectivity via IPv6 and 6LoWPAN now makes Bluetooth a candidate for IoT applications. And don’t forget that Bluetooth Low Energy continues to penetrate beacons and wearable products. Combining Bluetooth with near-field communications (NFC) for seamless pairing makes it even more popular in consumer devices.
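
That 2.5× figure follows largely from version 4.2’s data length extension, which raises the maximum data-channel payload from 27 to 251 bytes so less airtime goes to per-packet overhead. A simplified airtime model (my assumptions: 1-Mb/s PHY, one data packet plus one empty acknowledgment per exchange, 150-µs interframe spaces) reproduces it:

```python
# Back-of-envelope check of Bluetooth 4.2's ~2.5x throughput claim.

PHY_RATE = 1e6       # b/s
OVERHEAD_BYTES = 10  # preamble(1) + access address(4) + header(2) + CRC(3)
T_IFS = 150e-6       # interframe space, seconds

def throughput_bps(payload_bytes):
    t_data = (OVERHEAD_BYTES + payload_bytes) * 8 / PHY_RATE
    t_ack = OVERHEAD_BYTES * 8 / PHY_RATE  # empty acknowledgment packet
    cycle = t_data + T_IFS + t_ack + T_IFS
    return payload_bytes * 8 / cycle

old, new = throughput_bps(27), throughput_bps(251)
print(f"27-byte payloads: {old/1e3:.0f} kb/s")   # ~320 kb/s
print(f"251-byte payloads: {new/1e3:.0f} kb/s")  # ~814 kb/s
print(f"improvement: ~{new/old:.1f}x")           # ~2.5x
```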

ZigBee’s new 3.0 version has become more attractive than ever, too—it consolidates the features of ZigBee’s many application profiles into one specification. That includes home automation, lighting, energy management, security, sensors, and healthcare monitoring (Fig. 2). Based on the popular ZigBee PRO specification, version 3.0 remains a great choice for IoT mesh networking and Internet connectivity.

2. ZigBee’s new version 3.0 will make this standard ever more popular for the Internet of Things in terms of home monitoring and control (courtesy of the ZigBee Alliance).

Needless to say, IoT developers now have two more excellent choices at their disposal.

NFC Progress…At Last

Near-field communications (NFC), a roughly decade-old short-range wireless technology, doesn’t compete with Wi-Fi, Bluetooth, or ZigBee. Rather, it’s found a niche or two in transit payment, secure entry, and smart posters. Its biggest challenge has been to become the wireless payment method in smartphones, replacing or at least supplementing credit-card payment methods. It was incorporated into some Android smartphones to implement payment schemes like Google Wallet and others. Overall, though, adoption by retailers and consumers was poor.

However, Apple put NFC into its new iPhone 6 models and implemented the Apple Pay system, which seems to have re-ignited interest in NFC and smartphone payment methods. In fact, LitePoint VP Curt Schmidek has seen a boost in sales of the company’s IQnfc production test units (Fig. 3). He believes that NFC will come into its own this year. With liability for fraudulent charges shifting from card-issuing banks to merchants, retail outlets should finally invest in NFC payment terminals simply because the systems offer far more security. With that transition expected to happen late this year, look for 2016 to be the year of NFC and increased use of smartphone payment systems.

3. IQnfc, developed by LitePoint, is a production line tester for NFC wireless devices. Shipping volume of the tester has spiked thanks to increased NFC usage in smartphones.

Improved Wi-Fi, and More of It

Where would we be without Wi-Fi?  It’s almost ubiquitous, with expectations of it being everywhere and free.  This year will see even more Wi-Fi with greater speeds, improved coverage, and even new uses.

First, adoption of the latest 802.11ac standard should be substantial. It has taken some time for this faster 5-GHz-only version to come online, but with new chips available along with improved routers and access points (APs), look for faster links everywhere. Use of 80- or 160-MHz-wide channels and modulation methods up to 256QAM has boosted data rates into the gigabit region.
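
Those gigabit figures fall straight out of the 802.11ac PHY arithmetic: wider channels carry more data subcarriers, 256-QAM packs 8 bits per subcarrier, and the short guard interval shrinks each OFDM symbol to 3.6 µs. A quick sketch using the standard 802.11ac values:

```python
# 802.11ac PHY rate from standard parameters (top MCS, short guard interval).

DATA_SUBCARRIERS = {20e6: 52, 40e6: 108, 80e6: 234, 160e6: 468}
BITS_PER_SUBCARRIER = 8   # 256-QAM
CODING_RATE = 5 / 6
SYMBOL_TIME = 3.6e-6      # OFDM symbol with short guard interval

def phy_rate_mbps(bw_hz, spatial_streams=1):
    bits_per_symbol = DATA_SUBCARRIERS[bw_hz] * BITS_PER_SUBCARRIER * CODING_RATE
    return spatial_streams * bits_per_symbol / SYMBOL_TIME / 1e6

print(f"80 MHz, 1 stream:   {phy_rate_mbps(80e6):.0f} Mb/s")     # ~433 Mb/s
print(f"160 MHz, 2 streams: {phy_rate_mbps(160e6, 2):.0f} Mb/s") # ~1733 Mb/s
```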

You will also see the emergence of 802.11ac Wave 2 products. These offer multiuser multiple-input multiple-output (MU-MIMO), which lets one access point handle more than one user at a time. This will provide greater access as well as near-gigabit data rates.

What’s not so clear is the path for the superfast 802.11ad version of Wi-Fi, known as WiGig. This 60-GHz-band wireless standard, with active high-gain beamforming antennas, offers speeds up to 7 Gb/s over short distances. Its most likely application is uncompressed video transfer. Chips are now available, but few end products. What will 2015 bring?

Major WLAN provider Ruckus Wireless offers up a variety of predictions for Wi-Fi. In addition to its “2015 is the year of 802.11ac” forecast, Ruckus expects wider adoption of the Wi-Fi Alliance’s Hotspot 2.0 (Passpoint) with the 802.11u standard, significantly improving the ability of Wi-Fi users to automatically connect to an AP and roam seamlessly from one AP to another. Another major prediction concerns the incorporation of virtualization into the WLAN via network function virtualization (NFV).

Ruckus also looks for wireless operators to increase use of Wi-Fi offload in order to boost capacity. Wi-Fi calling, or VoIP over Wi-Fi, continues to gain popularity—it allows voice calls from cell phones in locations beyond the reach of a cell site but near a Wi-Fi access point. This very popular feature could kill off the femtocell as a solution for poor cellular coverage. In addition, Wi-Fi offload should slow the adoption of LTE small cells. While small cells will soon become common, many of them will be Wi-Fi access points.

Finally, we may see some progress in extended-range Wi-Fi with coverage out to one kilometer. One development, called White-Fi or Super Wi-Fi, involves applying the 802.11af standard to the white-space spectrum. White space represents the unused TV channels in the 54- to 790-MHz VHF and UHF range. These 6-MHz-wide channels permit longer-range communications than Wi-Fi’s current 2.4- and 5-GHz bands. Using cognitive-radio methods, the 802.11af standard will greatly extend the range of Wi-Fi in rural areas and those with difficult terrain.

Another longer-range option is the 802.11ah standard, designed for the 902- to 928-MHz U.S. unlicensed band. With channel bandwidths up to 16 MHz, it can potentially deliver rates topping 300 Mb/s over its extended range.

The Internet of Things Becomes Reality

Connecting everything to the Internet seems like a nutty idea. Yet, the idea of linking “things” to other “things” and/or to humans via the Internet has not only generated lots of interest, but a surging number of new products. With Internet security becoming an increasing problem, you have to wonder about the implications. Yet new applications will produce substantial benefits in terms of convenience and time/cost savings.

A spokesperson for Broadcom Corp. offers this view of IoT: “IoT is a significant growth engine as it has the ability to connect anything. Today, there are approximately 1.75 billion smartphone users worldwide and an average of five to seven connected devices per home. Experts predict that by 2020, 50 billion devices and objects will be connected to the Internet. What’s perhaps most exciting about this market is the low barrier for entry. There’s an opportunity for anyone with a brilliant idea to quickly test ideas and bring them to market. For example, Broadcom offers a $20 Wireless Internet Connectivity for Embedded Devices (WICED) development kit, called WICED Sense, that gives developers access to the technology needed to quickly and affordably test an idea and create a prototype.”

IoT will exploit a variety of wireless methods. Broadcom’s WICED and Texas Instruments’ SimpleLink products are two examples. But with new standards, Bluetooth and ZigBee also become candidates for IoT products. Furthermore, cellular remains an option for some applications.

A Crippled Internet

The government wants to regulate the Internet. Though it’s tried to do so on several occasions only to lose in court, the people in power aren’t giving up. The latest efforts include FCC Chairman Tom Wheeler’s hybrid plan and President Obama’s suggestion to categorize the Internet as a utility under Title II of the Communications Act of 1934. 

The idea is to keep the Internet open and fair. Of course, today the Internet is open and basically fair without regulation. Some argue that the carriers will throttle traffic and raise fees unfairly. Others argue that regulation is necessary to ensure that traffic from large and small organizations alike is treated the same.

While some regulation seems practical to thwart potential abuses, regulation also tends to raise taxes and costs, as well as restrict innovation and investment. Either way, some regulation appears inevitable. No decision has been made yet, but look for a resolution sometime during 2015. Hopefully it will be reasonable and not cripple the best technological development of the century.

On top of the predictable forthcoming regulation, another possible government decision threatens to further cripple the Internet. In September, the U.S. Commerce Department’s contract with the Internet Corporation for Assigned Names and Numbers (ICANN) will end, terminating the U.S.’s control over the Internet. The current administration wants to turn control over to some international organization.

With individual governments able to make decisions about the Internet, we could see severe restrictions imposed on a country-by-country basis. ICANN has tried to stay fair and neutral under U.S. influence. Let’s hope the government changes its mind and keeps ICANN in its fold. Otherwise, 2015 could be a disastrous year for the Internet.

Spectrum Shortage Solutions

Some claim that there’s no spectrum shortage, but the wireless carriers say otherwise. The lack of usable spectrum is the main limitation to boosting 4G capacity and speed. A couple of approaches are being taken to solve this problem. The first revolves around spectrum auctions—a recent spectrum auction generated over $40 billion. The FCC freed up space in the AWS-3 spectrum in the 1700- and 2100-MHz range for cellular usage, most of it going to major carriers like AT&T, Verizon, T-Mobile, and Dish Network.

More auctions are on the way. In 2016, the FCC will ask TV broadcasters to voluntarily give up their channels and either go off the air or move to a different part of the spectrum. This spectrum is worth billions, and many TV station owners are seriously considering it. With only about 10% of the population getting over-the-air TV, and with increased competition from cable and Internet TV, it’s expected that some stations will participate. The FCC hopes to free up an additional 100 MHz and generate about $45 billion in the process.

Spectrum sharing is another technique employed to ease the spectrum crunch.  The best example is the 2.4- to 2.5-GHz unlicensed band used by Wi-Fi, Bluetooth, ZigBee, and even some cordless phones.  It works because of the limited range, low power, and some special co-existence techniques. 

We can look for more spectrum sharing that leverages cognitive-radio techniques. The TV white-space spectrum now uses these for unlicensed data transmission.


Another solution is to move to the higher frequencies in the millimeter-wave bands.  These frequencies are already seeing increased usage as the latest semiconductors emerge to make them practical.  You will no doubt see cellular systems adopt the millimeter bands for 5G.

The “Softwarization” of Networking

So far, networking has been a balance of hardware and software.  Now, though, we’re seeing a gradual shift to the software side. With the meteoric rise in Internet traffic (especially video), growing cloud needs, and the potentially overwhelming Internet of Things, current networks find it harder to scale systems to handle the capacity and speed demands. The proposed solution is a new software-based paradigm for networking.

One solution—software-defined networking (SDN)—replaces some hardware with software. SDN separates the network’s data and control planes. The Open Networking Foundation (ONF) defines SDN as “an emerging network architecture where network control is decoupled from forwarding and is directly programmable.” OpenFlow, a standard protocol that lets a central controller program the forwarding tables of switches, is the best-known SDN enabler.
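
To make the decoupling concrete, here’s a toy match-action model in Python. This is not the OpenFlow wire protocol or any real controller’s API—just the concept: the controller installs prioritized match-action rules, and the switch does nothing but match headers and forward:

```python
# Toy SDN model: controller installs rules; switch only matches and forwards.
from dataclasses import dataclass, field

@dataclass
class FlowRule:
    match: dict      # header fields to match, e.g. {"dst_ip": "10.0.0.2"}
    action: str      # e.g. "output:port2" or "drop"
    priority: int = 0

@dataclass
class Switch:
    table: list = field(default_factory=list)

    def install(self, rule):
        """Called by the controller (control plane)."""
        self.table.append(rule)
        self.table.sort(key=lambda r: -r.priority)  # highest priority first

    def forward(self, packet):
        """Pure data plane: first matching rule wins."""
        for rule in self.table:
            if all(packet.get(k) == v for k, v in rule.match.items()):
                return rule.action
        return "send-to-controller"  # table miss

sw = Switch()
sw.install(FlowRule({"dst_ip": "10.0.0.2"}, "output:port2", priority=10))
print(sw.forward({"dst_ip": "10.0.0.2"}))  # output:port2
print(sw.forward({"dst_ip": "10.0.0.9"}))  # send-to-controller
```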

Another software approach is the aforementioned network functions virtualization, or NFV. Its goal is to migrate network operations from dedicated hardware to multiple virtual machines running on common servers. This should lead to improved utilization of existing resources while lowering costs. Expectations are that NFV will be used in conjunction with SDN to further improve network performance and efficiency.

SDN and NFV have yet to be widely implemented, but the movement is underway and migration will take years.
