
Datacenter growth, cloud service providers turn up the heat on test and measurement

April 3, 2018

Cloud service providers (CSPs) such as Amazon, Google, Microsoft, and Apple, to name a few, are driving big changes in the wired communications technologies used to power datacenters. Unlike a few years ago, when component manufacturers drove technology development, the CSPs are now actively involved in standards groups and pushing hard to drive key standards forward, all with an eye toward improving bandwidth while lowering power and cost. This in turn is putting big pressure on designers and test-and-measurement vendors alike to innovate continually to keep pace.

Wired communications encompasses the technologies driving datacenter ecosystems. These technologies run over copper or optical interfaces and are used as network interfaces or within a server or storage platform. Wired communications also extends beyond the datacenter to include consumer electronics and the wired interfaces within automobiles, for example.

With the rapid proliferation of connected devices and sensors, we’re dealing with more data than ever before, and that is driving the need for higher bandwidth within and outside of datacenters as well as in consumer products, including cars. The data must be stored and accessed quickly and reliably, and in many applications it is processed multiple times within the datacenter. These exponential increases in data analysis are driving new technologies across the ecosystem and must be balanced against the need to keep everything low-cost and low-power.

These larger industry trends make wired communications an exciting segment, since designers and standards bodies are really pushing the limits of what’s possible. A few years ago, roadmaps for these technologies were clear. In today’s environment the implementation details of those roadmaps are less clear, but the goal remains the same: increase bandwidth while lowering cost and power consumption. PAM4 and other higher-order modulation formats are new ways to increase data rates without increasing symbol rates. The tradeoff, of course, is increased complexity.
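To make that tradeoff concrete, here is a minimal sketch of how PAM4 packs two bits into each of four amplitude levels, doubling throughput at a fixed symbol rate. The level values and Gray mapping are illustrative only, not taken from any specific standard.

```python
# Minimal sketch: map a bit stream onto PAM4 symbols (2 bits per symbol).
# Level values and Gray mapping are illustrative, not from any standard.

GRAY_MAP = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}  # four amplitude levels

def bits_to_pam4(bits):
    """Group bits in pairs and map each pair to one of four PAM4 levels."""
    assert len(bits) % 2 == 0, "PAM4 consumes bits two at a time"
    return [GRAY_MAP[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

bits = [1, 0, 0, 1, 1, 1, 0, 0]   # 8 bits of payload
symbols = bits_to_pam4(bits)      # 4 symbols carry those 8 bits
print(symbols)                    # [3, -1, 1, -3]
# At the same symbol (baud) rate, PAM4 therefore carries twice the bits of NRZ.
```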

Tracking the trends

A big trend we are seeing is a change in who’s driving the standards. The CSPs are now at the forefront and working closely with standards bodies. The big players are active participants, defining the next-generation technologies that will build the networks and solutions within the datacenter. In some cases they sit on the boards of directors of these consortiums to drive the technology forward, and we’re also seeing more multi-source agreements, or MSAs. MSAs are founded by groups of companies with the goal of driving technology forward faster than the historical pace of traditional standards bodies.

New standards and interface technologies still won’t be enough to keep pace with growing data demands. Because of this, there’s also a move to edge computing: datacenters will sit closer to the access points and will be smaller in scale. This will be especially critical for so-called hot data, that is, data that needs to be acted upon quickly.

As cost and power consumption come down, it’s clear that coherent technology will eventually move into the datacenter. The focus on power reduction may also lead to the elimination of the electrical interface between a switch and a transceiver, enabled by on-chip optics. It is also possible that traditional technologies will bifurcate: NRZ may be the better choice for one application, while PAM4, or optical rather than electrical signaling, may win in another. Datacenters will experiment with both and pick the right choice for a given application. This will in turn change the ecosystem in terms of the players, and change the type of testing required.

New design challenges

The interplay of new standards, approaches, and technologies puts more pressure on test and measurement. Design complexities are greater than they were in the past, and engineers will need to run more tests to make sure a product is ready for deployment. As technology becomes more complex, there will be new challenges in interoperability and a greater focus on making sure components work together in specific applications. This is a shift away from simply proving compliance to a standard and assuming that two compliant components will interoperate.

Another key challenge facing manufacturers is the uncertainty of which technology to use. There’s simply a lack of clarity about which technology is going to win, and so engineers are implementing multiple technologies. If you look at roadmaps, you’ll see 28/56 Gbaud PAM4, but also 56 Gb/s NRZ, the traditional signaling methodology. Network equipment suppliers are doing both, because they’re not exactly sure what’s going to work best, and which signaling technology might work better in one application versus another.
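As a quick sanity check on those numbers, the arithmetic below (a simple illustrative calculation, not tied to any specific standard) shows why 28 Gbaud PAM4 matches the line rate of 56 Gbaud NRZ, while 56 Gbaud PAM4 doubles it.

```python
# Quick arithmetic on the signaling options mentioned above:
# line rate = symbol rate x bits per symbol.
options = [
    ("NRZ,  56 GBd", 56e9, 1),   # 1 bit per symbol
    ("PAM4, 28 GBd", 28e9, 2),   # 2 bits per symbol
    ("PAM4, 56 GBd", 56e9, 2),
]
for name, baud, bits_per_symbol in options:
    print(f"{name}: {baud * bits_per_symbol / 1e9:.0f} Gb/s")
# NRZ,  56 GBd: 56 Gb/s
# PAM4, 28 GBd: 56 Gb/s   <- same throughput at half the symbol rate
# PAM4, 56 GBd: 112 Gb/s
```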

With the shift from NRZ to PAM4 and from electrical to optical, and uncertainty about which is best for a given application, testing can quickly become an added complication.

The growing role of test in enabling innovation

As datacenters and datacenter technology change, the methods used to measure designs must change as well. From new modulation schemes to new interconnect protocols, from expanding datacenters to optical links, to big data and the changes in how standards are developed, one aspect of wired communications remains constant: testing will continue to be a critical factor in enabling next-generation technology.

Test and measurement is also playing a critical role in enabling the move to new modulation technologies, such as PAM4. These modulated signals have a lower signal-to-noise ratio and one third the amplitude of equivalent NRZ signals, which means engineers must deal with small, noisy signals. Evaluating PAM4 signals successfully requires more sophisticated tools and features: a state-of-the-art feature set, advanced triggering, and sophisticated debug capabilities are required to confidently troubleshoot and measure PAM4 signals.
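To put rough numbers on that amplitude penalty, the short sketch below assumes an idealized link with the same peak-to-peak swing for both formats (the 0.8 V figure is purely illustrative) and computes the PAM4 eye height along with the corresponding ideal SNR penalty of roughly 9.5 dB.

```python
import math

# For the same peak-to-peak swing, PAM4 splits the signal into three stacked
# eyes, so each eye opening is one third of the single NRZ eye.
nrz_swing_vpp = 0.8                  # example swing in volts (illustrative)
nrz_eye = nrz_swing_vpp              # NRZ: one eye spans the full swing
pam4_eye = nrz_swing_vpp / 3         # PAM4: three stacked eyes

penalty_db = 20 * math.log10(nrz_eye / pam4_eye)
print(f"PAM4 eye height: {pam4_eye * 1000:.0f} mV vs {nrz_eye * 1000:.0f} mV for NRZ")
print(f"Ideal SNR penalty: {penalty_db:.1f} dB")   # about 9.5 dB
```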

Time to market is critically important for engineers developing wired communications technologies for use in datacenters. Automation and advanced debug tools are necessary to bring designs to market more quickly and reduce costs. These tools also allow engineers to overcome problems from both a compliance perspective and an interoperability perspective, using fully automated acquisition and analysis tools and debug at both the physical and protocol layers.
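As a loose illustration of what such automation can look like, the sketch below uses PyVISA with generic SCPI to acquire a repeated measurement. The instrument address and the MEASUREMENT:VALUE? query are hypothetical placeholders, not the syntax of any particular instrument or toolchain discussed in this article.

```python
# Hypothetical sketch of an automated acquisition loop using PyVISA and
# generic SCPI. The address and measurement query below are placeholders.
import pyvisa

rm = pyvisa.ResourceManager()
scope = rm.open_resource("TCPIP::192.168.1.10::INSTR")   # placeholder address

print(scope.query("*IDN?"))            # standard IEEE 488.2 identification

results = []
for trial in range(10):                # repeat acquisitions for statistics
    scope.write("*CLS")                # clear status before each run
    value = float(scope.query("MEASUREMENT:VALUE?"))   # hypothetical query
    results.append(value)

print(f"mean over {len(results)} runs: {sum(results) / len(results):.3e}")
scope.close()
```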

For further reading

“Tektronix demonstrates new optical modules, probe at OFC,” EE-Evaluation Engineering Online, March 13, 2018.

About the Author

Sarah Boen

Sarah Boen is general manager, Wired Communications, at Tektronix. Her 19-year tenure at Tektronix has included such roles as product marketing manager and planner, program manager, and software design engineer. She has an MBA and BSCS from the University of Portland.
