Effective high-speed digital product development begins with sophisticated design-automation tools and ends with high-performance test equipment that ensures fully verified, debugged, and characterized designs are ready for manufacturing. That’s my takeaway from DesignCon, held Jan. 28-31 in Santa Clara.
In a Jan. 29 keynote address, Jonah Alben, senior vice president of engineering at NVIDIA, discussed the design-automation aspects. To the layman, he said, technology advances may seem to be driven solely by Moore’s Law, but EDA plays a key role as well. If we were still using the design tools of 30 years ago, he said, we would not be able to make use of all the transistors Moore’s Law lets us build.
Yet it may not be just laymen who put too much emphasis on Moore’s Law. “Despite the value of EDA,” Alben said, “companies tend to under-invest in it, and that includes my company as well.” To avoid complacency, companies must invest in simulation and visualization, signal-integrity and electromagnetic analysis, SPICE simulation, and computational lithography.
To ensure adequate investment, Alben recommended that engineers defend their productivity, define a long-term investment strategy, choose important near-term investment goals, focus on improving methodology with every project, allocate a budget for methodology staffing, and involve product engineers.
In an exclusive interview at DesignCon, Jay Alexander, vice president and general manager of Agilent Technologies’ Oscilloscope Products Division, said high-speed digital markets are driven by three key megatrends: the rise of Asia and developing nations as design centers, the increasing dominance of mobile and cloud computing in the IT industry, and the need for energy efficiency.
Innovations, he said, center on faster processors operating at lower power, higher-performance internal buses, and advanced memory components. Voltage levels, he added, can be so low that it’s very difficult to make a measurement without affecting the signal of interest.
Test equipment, Alexander said, must evolve to meet the needs of designers who are combining multiple bus technologies (with PCIe over SATA implementations, for example), and simulation and modeling tools must evolve to support 3-D stacking and through-silicon vias. Agilent, he said, offers a range of products that support design and simulation, analysis and debug, and compliance test.
In a keynote address on Jan. 30, Mike Santori, business and technology fellow at National Instruments, traced the evolution of instrumentation and described a complementary evolution in standard commercial off-the-shelf (COTS) electronic products. In the 1980s, he said, NI offered GPIB boards to support test automation via personal computers. The ‘90s saw the introduction of LabVIEW and the concept of virtual instrumentation, in which the computer becomes the test platform.
As instrumentation has evolved, so too have COTS electronics, beginning with the vacuum tube, which Santori called the first COTS electronic component. Progress continued through the transistor and the integrated circuit, and the evolution continues today in accordance with Moore’s Law, which drives not just CPU power but memory and graphics functions as well. Today, Santori said, we take computing power for granted. The cellphone has evolved from a simple communications device into a full-blown general-purpose computer.
And as transistors proliferate, so too does software, making test and measurement significantly more challenging as engineers design and test multiprotocol devices. He cited “software-designed instrumentation” as a technique that can help designers manage that complexity; such instruments give customers access to internal FPGAs.
Today’s design challenges require innovative thinking, Santori said, with software-designed instruments changing the rules. He concluded with some good advice: “Think beyond your current tools and approaches.”
Executive Editor
Visit my blog: http://bit.ly/N8rmKm