There has never been a more dynamic time in the electronics industry. In addition to refining design and test strategies to emerge from the Great Recession, the engineering community is driving rapid advances to satisfy the demands of the Information Age.
Amid all of the uncertainty we are experiencing today, one thing remains certain: new ages of modernization improve productivity and drive new technologies to price points that make them applicable across many industries.
The PC Age demonstrated this point amid the recession of the late 1970s and early 1980s, as many innovations significantly improved the productivity of society and lowered the cost of business. The era drove dramatic changes such as virtual instrumentation, which used a PC-based architecture with graphical data flow programming. It also brought about plug-in measurement cards that provided a lower cost of test and higher performance than traditional instruments with similar I/O specifications.
Today, the Information Age is driving innovation in three significant areas with even greater promise for advancing automated test systems: parallel processing, high-speed data transfer, and software-defined devices.
CHANGES FOR AUTOMATED TEST
The underlying technology advances stemming from these innovation areas include multicore processors, FPGAs, PCI Express, and user-defined software tools for programming commercial off-the-shelf (COTS) hardware. Taken together, these areas and their underlying technologies indicate that major changes are in store for the automated test industry.
A major technology shift for test and measurement is parallel processing using a combination of multicore processors and open, user-defined FPGAs. Multicore processors continue to widen their parallel processing advantage over traditional single-core processors as more cores become available on a single chip. In fact, quad-core processors are already common in many PCs, and eight-core processors are available on server-class machines.
Multicore processors are ideal for handling the computationally intensive processing commonly found in test and measurement applications by distributing the processing tasks across multiple cores in parallel. In some applications, test engineers have seen throughput increases of 100% or more as a result of using multicore processors in their applications.
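The distribution pattern described above can be sketched with nothing more than a standard library. The snippet below is a minimal illustration, not any vendor's API: `analyze` is a hypothetical stand-in for a per-record analysis routine, and a process pool fans the records out across the available cores.

```python
# Minimal sketch of distributing a CPU-bound analysis task across cores
# using Python's standard multiprocessing module. The analyze() routine
# is a hypothetical stand-in for a per-record measurement computation.
from multiprocessing import Pool, cpu_count

def analyze(record):
    # Stand-in for a compute-heavy step (e.g., a limit check or DSP pass).
    return sum(x * x for x in record)

def analyze_all(records):
    # One worker per core (capped at the number of records); the pool
    # maps records to workers in parallel and preserves input order.
    workers = max(1, min(cpu_count(), len(records)))
    with Pool(processes=workers) as pool:
        return pool.map(analyze, records)

if __name__ == "__main__":
    print(analyze_all([[1, 2, 3], [4, 5]]))  # -> [14, 41]
```

On a CPU-bound workload like this, separate processes sidestep Python's global interpreter lock, which is why a process pool rather than threads is the idiomatic choice here.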
Like multicore processors, FPGAs excel at parallel processing, and they have become ubiquitous in today’s smart devices. Their widespread use has helped lower the cost of applying parallel processing technology throughout many industries, including T&M.
User-defined FPGAs embedded in COTS instrumentation enable a new level of capability for engineers to run their processing and analysis algorithms in real time at hardware speeds. This offloads processing from the host PC, allowing other tasks to utilize the processor. It also permits test engineers to perform more sophisticated user-defined algorithms on the FPGA for high-speed closed-loop systems, protocol-aware communications test, and advanced system control applications. Expect to see major growth in user-defined FPGAs in automated test applications over the next three years.
THE PCI EXPRESS STANDARD
The need for higher-speed data transfer via circuits, cables, and wireless connections is constantly growing. Therefore, it’s no surprise that this trend maps directly to the high-speed data transfer needs of today’s test systems, which are responsible for testing the devices that transmit that information.
Fortunately, the PCI Express high-speed data standard provides an efficient COTS-based approach. The high-speed serial PCI Express bus allows for up to 1 Gbyte/s of theoretical bidirectional data transfer per instrument based on a Gen 1 x4 data link. This is a major boost in bandwidth compared to the traditional 1-Mbyte/s and 132-Mbyte/s general-purpose interface bus (GPIB) and PCI instrumentation buses, respectively.
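The 1-Gbyte/s figure follows directly from the Gen 1 link parameters. A quick back-of-the-envelope check, using the standard line rate and coding overhead (plain Python, no instrumentation API assumed):

```python
# PCI Express Gen 1 signals at 2.5 GT/s per lane and uses 8b/10b line
# coding, so only 8 of every 10 bits on the wire carry payload.
def pcie_gen1_mbytes_per_s(lanes):
    line_rate = 2_500_000_000            # transfers/s per lane
    payload_bits = line_rate * 8 // 10   # strip the 8b/10b coding overhead
    # bits/s -> bytes/s -> Mbytes/s, per direction of the link
    return payload_bits // 8 * lanes // 1_000_000

print(pcie_gen1_mbytes_per_s(4))  # x4 link -> 1000 Mbytes/s, i.e., 1 Gbyte/s
print(pcie_gen1_mbytes_per_s(1))  # single lane -> 250 Mbytes/s
```

Even the single-lane figure of 250 Mbytes/s comfortably exceeds the GPIB and PCI numbers cited above.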
PCI Express also offers better latency specifications than other bus options in GPIB, Ethernet, and USB. This is critical for high-throughput, low-latency automated test systems in many communications, aerospace and defense, semiconductor, and consumer applications.
Another benefit of PCI Express is that backward-compatible Gen 2 and Gen 3 versions of the standard are already in development to further increase data transfer rates well beyond 2 Gbytes/s per instrument using a x4 link. Instruments requiring even higher data bandwidth can use x8 or x16 data links to double or quadruple their data throughput.
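The scaling described above is linear in lane count and roughly doubles per generation. The sketch below uses the nominal per-lane payload rates (Gen 1 and Gen 2 use 8b/10b coding; Gen 3 moves to 8 GT/s with more efficient 128b/130b coding, about 985 Mbytes/s per lane) to show how generation and link width combine:

```python
# Nominal per-lane payload rates in Mbytes/s, per direction.
PER_LANE_MBYTES_S = {1: 250, 2: 500, 3: 985}

def link_mbytes_per_s(gen, lanes):
    # Total link throughput scales linearly with lane count.
    return PER_LANE_MBYTES_S[gen] * lanes

print(link_mbytes_per_s(2, 4))   # Gen 2 x4 -> 2000 Mbytes/s (2 Gbytes/s)
print(link_mbytes_per_s(1, 8))   # widening Gen 1 x4 to x8 doubles the rate
print(link_mbytes_per_s(1, 16))  # an x16 link quadruples the x4 figure
```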
PCI Express can also be used across a copper or fiber-optic cable for distributed connectivity between computers in multi-computing applications. PCI Express provides peer-to-peer capabilities for routing high-speed data between two devices on a PCI Express bus. This is critical for many next-generation test systems since it allows for the efficient use of FPGA processing modules to serve as user-defined co-processors in a PCI Express-based automated test system.
PCI Express performance can be put to use today using PXI Express-based systems. PXI Express is an extension of the PXI industry standard that represents a ruggedized, CompactPCI-based form factor with additional timing and synchronization integrated into the system backplane. It also allows for the combined use of both PXI and PXI Express modules in an automated test system to maximize reuse and cost effectiveness. PXI is expected to see continued double-digit growth in adoption for automated test systems given its ability to provide solutions using PCI Express, multicore processors, and user-defined FPGAs today.
INVESTING IN SOFTWARE-DEFINED DEVICES
The rise of software-defined devices is the final significant area of investment for automated test. The growth of software-defined devices, leveraging vendor and user-defined apps, can be seen everywhere from smart phones to instrumentation. The rapidly rising popularity of open platforms built on COTS hardware is driving demand for greater software abstraction and for embedded software development tools that do not require advanced programming knowledge.
This is compounded by a similar need to abstract the difficulty of using traditional text-based development tools for programming multicore processing applications. For many years, National Instruments has been combining these models of computation into a software development environment well suited to test and measurement engineers.
With the graphical data flow paradigm of NI LabVIEW, engineers can begin developing multicore, embedded FPGA, and software-defined instrument applications with minimal training and no formal computer science or VHDL programming background. This is a breakthrough for test engineers because it lets them directly apply the technologies of the Information Age in their systems today.