Heterogeneous Computing And IP-To-The-Pin Will Drive Test In 2011

Dec. 9, 2010
NI's Richard McDonell looks to 2011 and sees multiple computing architectures pervading the test bench, as well as FPGAs being used in both DUTs and test systems to improve system-level test.

Keeping up with the latest technology trends and predicting the future is hard. Fortunately, National Instruments has a unique vantage point: it supplies more than 30,000 electronics companies each year. We also stay up to date through our internal R&D and through biannual technology exchanges with our key suppliers and top customers to get their outlooks on upcoming technologies.

Looking ahead to 2011, the two leading trends in test will be the use of multiple computing architectures in a test system, also known as heterogeneous computing, and applying higher-level software abstraction tools to implement advanced IP-to-the-pin for FPGA-based reconfigurable instrumentation. Additional trends will include a focus on achieving increased organizational test integration and investing in proper tools and architectures for designing flexible system software stacks.

Heterogeneous Computing
Automated test systems have always consisted of multiple types of instruments, each best suited to different measurement tasks. This same trend is now affecting how we perform computation in a test and measurement system. Applications like RF spectrum monitoring, for example, require inline, custom signal processing and analysis not possible using a standard PC CPU.

To address these needs, engineers will have to turn to heterogeneous computing architectures to distribute processing and analysis among different computing nodes. The most common computing nodes in test systems are central processing units (CPUs), graphics processing units (GPUs), FPGAs, and cloud computing resources.

While heterogeneous computing provides new and powerful computing architectures, it also introduces additional complexities in the development of test systems—the most prevalent being the need to learn a different programming paradigm for each type of computing node. For instance, to fully utilize a GPU, programmers must modify their algorithms to massively parallelize the data and translate the algorithm math to graphics rendering functions.
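
As a rough sketch of that restructuring (the function names and threshold-detection step are hypothetical, and the example uses NumPy on the CPU rather than an actual GPU runtime), the key change is turning a per-sample loop into array-wide operations whose elements can be computed independently:

```python
import numpy as np

def power_loop(iq_samples, threshold):
    """Sequential form: examine one sample at a time (hard to map onto a GPU)."""
    hits = []
    for i, s in enumerate(iq_samples):
        if (s.real ** 2 + s.imag ** 2) > threshold:
            hits.append(i)
    return hits

def power_parallel(iq_samples, threshold):
    """Data-parallel form: every element is independent, so a GPU runtime
    could assign one thread per sample to the same computation."""
    power = iq_samples.real ** 2 + iq_samples.imag ** 2   # elementwise, no loop
    return np.nonzero(power > threshold)[0].tolist()

iq = np.random.randn(100_000) + 1j * np.random.randn(100_000)  # simulated I/Q data
assert power_loop(iq, 4.0) == power_parallel(iq, 4.0)
```

Once an algorithm is expressed in this element-independent form, a GPU runtime can spread the per-sample work across thousands of threads; the sequential version cannot be mapped onto that hardware efficiently.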

Meanwhile, FPGAs often require the knowledge and use of low-level hardware description languages like VHDL to configure specific processing capabilities. Fortunately, work is underway to abstract the complexities of specific computing nodes so heterogeneous computing can enable many new possibilities in test system development.

IP-To-The-Pin
Moore’s Law is bringing FPGA capabilities in line with ASICs. In fact, research firm Gartner stated in a 2009 report that FPGAs now hold a 30-to-1 edge in design starts over ASICs. This performance boost, combined with the inherent advantage of being software-defined, has created a market shift toward FPGA-based designs for both electronic devices and test instrumentation.

This common programmable core is enabling engineers to deploy design building blocks, known as intellectual property (IP) cores, to both their device under test (DUT) and reconfigurable instruments. The IP cores include functions/algorithms such as control logic, data acquisition, digital protocols, encryption, math, signal processing, and more.

By embedding the design IP directly in their test instrumentation to perform system-level test, test engineers can dramatically shorten design verification and validation while improving production test time and fault coverage. This capability is called IP-to-the-pin.
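
As a conceptual sketch (the protocol, data values, and function names are hypothetical, and Python stands in for what would really be HDL or LabVIEW FPGA code compiled to both FPGAs), the same IP core can serve the DUT and the instrument that tests it:

```python
def crc8(data: bytes, poly: int = 0x07) -> int:
    """A small 'IP core': the CRC-8 used by the DUT's digital protocol."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 0x80:
                crc = ((crc << 1) ^ poly) & 0xFF
            else:
                crc = (crc << 1) & 0xFF
    return crc

def dut_transmit(payload: bytes) -> bytes:
    """Simulated DUT behavior: frame the payload with a CRC from the shared IP."""
    return payload + bytes([crc8(payload)])

def instrument_check(frame: bytes) -> bool:
    """The test instrument reuses the same IP to validate traffic at the pin."""
    payload, received_crc = frame[:-1], frame[-1]
    return crc8(payload) == received_crc

frame = dut_transmit(b"\x10\x20\x30")
assert instrument_check(frame)                                        # good frame passes
assert not instrument_check(frame[:-1] + bytes([frame[-1] ^ 0xFF]))   # corrupted CRC caught
```

Because the instrument runs the same protocol IP as the device, it can generate and check real traffic at the pin instead of relying on canned stimulus files.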

Moore’s Law will continue to accelerate this trend by providing more powerful FPGAs. Vendors are also beginning to integrate FPGAs with devices such as processors and data converters to deliver more performance and user programmability even closer to the pin.

Another element of this trend is the increase in availability and capability of high-level synthesis tools (HLS), such as NI LabVIEW FPGA, for test engineers. This abstraction increases the accessibility of FPGA designs to more engineers and provides a platform for programming at a system level.

Organizational Test Integration
For the past two decades, organizations have sought to improve the performance of test teams in design and production by drawing clear boundaries around these groups and allowing them to improve independently. However, this strategy has started to generate diminishing returns. To keep up with the increasing time-to-market and cost pressures of next-generation products, companies are now looking to organizational test integration.

Best-in-class companies are integrating test organizations in design and production to decrease test development time, reduce costs, and improve quality. For example, by improving the new product introduction process and involving production test earlier in design, organizations can develop test systems faster and reduce time-to-market.

In addition, the increased use of test automation in design and production has shown both teams that common software and instrumentation platforms can be used across the organization, reducing capital and training costs. Finally, test teams are developing reusable software components that not only reduce development time but also increase quality by providing more reliability and repeatability of measurements.

System Software Stack
With the increasing role of automated test software during the past decade, today’s industry-leading companies are putting greater emphasis on designing more robust system software stacks to ensure maximum longevity and reuse. Most companies are moving away from monolithic test applications built around hard-coded constants and direct driver calls to the instruments. Instead, modularity is achieved through separate yet tightly integrated layers of test management software, application software, and driver software.

Two key technology components gaining increased usage are process models and hardware abstraction layers. The process model plays a primary role in separating all of the test steps in a sequence from the non-test tasks, such as reporting, database logging, importing test limits, and DUT tracking. Hardware abstraction layers separate the test application software from the instrument hardware, which minimizes the time and costs associated with migrating or upgrading test systems.
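
A minimal sketch of a hardware abstraction layer, assuming hypothetical class and method names rather than any particular vendor’s driver API, might look like this:

```python
from abc import ABC, abstractmethod

class DmmInterface(ABC):
    """What the test application needs from any DMM, independent of vendor."""
    @abstractmethod
    def measure_voltage(self) -> float: ...

class VendorADmm(DmmInterface):
    def measure_voltage(self) -> float:
        return 3.29   # a real driver call (e.g., through a VISA session) would go here

class VendorBDmm(DmmInterface):
    def measure_voltage(self) -> float:
        return 3.31   # simulated reading from a different instrument

def test_supply_rail(dmm: DmmInterface, low: float, high: float) -> bool:
    """A test step: measurement and limit comparison only; reporting, logging,
    and DUT tracking are left to the process model outside the step."""
    return low <= dmm.measure_voltage() <= high

for dmm in (VendorADmm(), VendorBDmm()):
    result = test_supply_rail(dmm, low=3.0, high=3.6)
    print(type(dmm).__name__, "PASS" if result else "FAIL")   # reporting stays outside the step
```

Because the test step depends only on the abstract interface and performs no reporting itself, instruments can be swapped and the process model can evolve without touching the measurement code.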

As the test industry works to keep pace with the innovative advances in DUTs, you must constantly evaluate their impact on your test technologies and methods. Applying these trends for 2011 to your test strategy will help keep your organization ahead of the industry and able to meet the growing demands on your business.
