Test Manufacturers Fight Fire With Fire
Automatic test equipment (ATE) is a broad umbrella term sheltering device test, board test (both unloaded and loaded), and system test. Rising silicon transition speeds and higher data rates demand that ATE keep pace with the products it tests. But mixed-signal devices, plus the proliferation of RF chips for cell phones, Bluetooth, and the like operating in the 2-GHz domain, are particularly troublesome for device testers.
As the saying goes, you have to fight fire with fire. Device-test manufacturers are doing just that, devising mixed-signal solutions, in some cases right at the probe points, to provide the necessary interfaces. Then too, there are the issues of shrinking packages, with some board-mounted discrete components measuring just 1 by 2 mm.
Couple all of that with surface-mount attachment, which makes interfacing with flip chips more difficult because the connections are completely hidden from view, and it's clear why board testing has become such a major hurdle for test engineers. So it's no surprise that X-ray and optical inspection are likely to augment, and in some cases supplant, in-circuit testing in the quest to weed out defects during board assembly.
Below are some of the major issues and trends in both the ATE and test standards domains.
The ascendance of surface-mount technology and the shrinking of components add up to just one thing: a massive headache for manufacturing and test engineers. Automatic testing of boards as a means of identifying manufacturing defects will be augmented, and in some cases supplanted, by optical and/or X-ray inspection. Decision makers must therefore choose among in-circuit, optical, and X-ray testing, or a combination of all three.
More software will be introduced to help engineers decide whether in-circuit, X-ray, or optical testing, or some mix of them, is the best path to follow. Once a decision is made, the software will assist in collecting the right data to ensure that you're still pursuing the right path, confirming that your requirements haven't changed over time.
Leaders in the test industry are confronting the fact that in three to five years, many chips won't be testable at all by conventional means. Even building a $10 million tester and spending four days testing a single chip wouldn't cut it. So more novel test strategies will begin to emerge, supplanting some of those currently used in volume testing. Built-in self-test (BIST), with much of the test capability designed onto the chip itself, will play a major role.
The new IVI specifications will be announced this year by the Interchangeable Virtual Instruments group, which has been meeting for two years. This consortium includes Agilent Technologies, Keithley Instruments, National Instruments, Rohde & Schwarz, Tektronix, and Teradyne. It's addressing instrument drivers and interfaces and how they will evolve. More and more, open consortia, rather than the traditional standards bodies, will develop standards.
Look also for standards that will strengthen the role of test and measurement (T&M) in conventional computer environments, enhancing the computer's role in test programming, test analysis, and wafer sort.
The major conference of the year for test professionals, the International Test Conference, is themed "Stressing the Fundamentals." Organizers plan to focus on the dual challenges of providing high-quality, cost-effective tests and developing or adapting new test approaches to meet today's test demands. The ITC will be held in Baltimore, Md., Oct. 8-10. For information, go to www.itctestweek.org.
Teradyne introduced the first computer-controlled IC test system, the J259.
Hewlett-Packard released a watershed study of DRAMs showing that U.S. components were dramatically less reliable than Japanese devices. This spawned a quality revolution, the Baldrige award, total quality management courses, and Six Sigma.
National Instruments cofounders James Truchard and Jeff Kodosky introduced LabVIEW, a graphical test development environment, thereby pioneering the concept of "virtual instrumentation."
Boundary-scan testing was introduced to solve accessibility problems caused by advanced IC packaging, such as ball grid arrays. Boundary scan was developed by the Joint Test Action Group (JTAG) in Europe and adopted as IEEE Standard 1149.1.
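The accessibility idea behind boundary scan can be illustrated with a toy model: a scan cell sits at each pin, the cells chain into one serial shift register, and a tester drives and observes every pin through a handful of JTAG wires instead of physical probes. The sketch below is purely illustrative (the class and method names are invented here), not a complete IEEE 1149.1 implementation:

```python
# Toy model of a boundary-scan chain (illustrative only; names are hypothetical,
# and this omits the TAP state machine and instruction register of IEEE 1149.1).
# Each cell sits between a pin and the core logic; cells chain into one serial
# shift register, so all pins are reachable through a single TDI -> TDO path.

class BoundaryScanChain:
    def __init__(self, num_pins):
        self.cells = [0] * num_pins   # one shift-register stage per pin
        self.pins = [0] * num_pins    # latched pin drive values

    def capture(self, pin_values):
        """CAPTURE-DR: sample the current pin states into the scan cells."""
        self.cells = list(pin_values)

    def shift(self, bits_in):
        """SHIFT-DR: clock new bits in at TDI while old bits fall out at TDO."""
        bits_out = []
        for b in bits_in:
            bits_out.append(self.cells[-1])     # last cell feeds TDO
            self.cells = [b] + self.cells[:-1]  # new bit enters at TDI end
        return bits_out

    def update(self):
        """UPDATE-DR: latch the shifted-in values onto the pins (EXTEST-style)."""
        self.pins = list(self.cells)


chain = BoundaryScanChain(4)
chain.capture([1, 0, 1, 1])           # observe pin states without probes
observed = chain.shift([0, 1, 0, 0])  # read old states out, shift test pattern in
chain.update()                        # drive the new pattern onto the pins
print(observed, chain.pins)           # captured bits emerge last-cell-first
```

One read/write cycle here touches all four pins through a single serial path, which is exactly why boundary scan sidesteps the probe-access problem that ball grid arrays created.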
VXIplug&play improved the interoperability of VXI (VME extensions for instrumentation) products from different manufacturers. It included a standard I/O library and specifications for building instrument drivers.
IEEE Standard 1149.4, the mixed-signal version of the digital boundary-scan standard, was announced.