More so than ever before, complex ICs and the pc boards they populate make design for test (DFT) a basic necessity. Increasing complexity, however, continually challenges DFT designers to keep pace. At one time, IC designers designed a chip, debugged the design by verifying that it worked, and then handed it over to test engineers for production testing.
Nowadays, it's getting difficult to distinguish the IC designer from the test engineer. There's a critical need for both to work more closely with each other for more cost-effective testing solutions.
DFT is often touted as the means for IC and board manufacturers to realize significant economic savings. At the device level, it sometimes replaces the traditional functional testing role in which ICs are tested at their I/O for functional performance.
Design engineers need to address the test problem with prevention approaches that anticipate device and board failures, rather than use testing methods that detect problems after they occur. With the advent of complex system-on-a-chip (SoC) devices and the inevitable march toward ever thinner line geometries of 90 nm or less to pack more devices into a given area of silicon, traditional IC test approaches are proving impractical (Fig. 1). The testing problem is steadily moving back from the test engineering to the design engineering department and continues to take on a larger role in EDA tools.
Many testing problems are directly related to shrinking design margins and increasing architectural complexities. With fewer gate delays fitting into each clock cycle, IC designers are finding it difficult to achieve timing convergence (Fig. 2).
There's a radical shift away from the logic-centric designs of the 1980s, when IC designers were most concerned with "stuck-at-fault" failure conditions, to the latest-generation devices, where IP core use and reuse are the norm. Designers need to move away from inefficient, locally optimized test solutions toward more robust approaches.
DFT methodologies consisting of scan-test and built-in self-test (BIST) approaches are becoming mandatory for chip designs. The latest DFT methods now involve on-chip test subblocks to minimize testing costs. Newer DFT solutions like embedded deterministic test (EDT) and built-in self-repair (BISR) slash away at test times and thus lower costs.
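The BIST idea can be illustrated with a toy model: an on-chip linear-feedback shift register (LFSR) generates pseudorandom stimulus, and the responses of the logic under test are compacted into a signature that's compared against a golden value. This is only a sketch, assuming an illustrative 4-bit polynomial and a stand-in circuit function, not any vendor's actual BIST architecture:

```python
# Hedged sketch of logic BIST. The LFSR taps, widths, and the
# circuit() stand-in are illustrative assumptions, not a real design.

def lfsr_patterns(seed, taps, width, count):
    """Yield pseudorandom test patterns from a simple Fibonacci LFSR."""
    state = seed
    for _ in range(count):
        yield state
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1          # XOR the tap bits
        state = ((state << 1) | fb) & ((1 << width) - 1)

def signature(responses, width=16):
    """Compact a response stream into one signature (MISR analogue)."""
    sig = 0
    for r in responses:
        sig = ((sig << 1) ^ r) & ((1 << width) - 1)
    return sig

def circuit(x):
    """Stand-in for the combinational logic under test."""
    return (x ^ (x >> 2)) & 0xF

patterns = list(lfsr_patterns(seed=0b1001, taps=(3, 0), width=4, count=10))
golden = signature(circuit(p) for p in patterns)

# In self-test mode, the chip regenerates the patterns, recomputes the
# signature, and flags a fault if it differs from the golden value.
assert signature(circuit(p) for p in patterns) == golden
```

The appeal for DFT is that the tester only has to start the test and read one signature, rather than deliver and capture full pattern sets at the pins.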
Boundary-scan testing is particularly prominent in the board-level arena, where it has become an indispensable tool for cost-effective production testing. Much of boundary scan's value lies in the software drivers written for exercising test vectors. Boundary-scan testing reduces the costs of traditional board functional testing by speeding up the testing process. While few would dispute that functional board testing is the ultimate means of determining that a board really works as intended, it's time-consuming and costly, particularly as boards become populated with denser components.
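Mechanically, boundary scan serializes test access: vectors are shifted one bit per clock through a chain of cells at the device pins, entering at TDI and emerging at TDO (in IEEE 1149.1 terms). A minimal sketch of that serial shift, with an illustrative 4-cell chain:

```python
# Minimal sketch of IEEE 1149.1-style serial shifting through a
# boundary-scan register. Chain length and vectors are illustrative.

def shift_vector(chain_state, tdi_bits):
    """Shift bits in at TDI, one per TCK; return (new state, TDO bits)."""
    state = list(chain_state)
    tdo = []
    for bit in tdi_bits:
        tdo.append(state[-1])        # last cell in the chain drives TDO
        state = [bit] + state[:-1]   # all cells shift by one position
    return state, tdo

chain = [0, 0, 0, 0]                 # 4-cell chain, initially cleared
chain, out = shift_vector(chain, [1, 0, 1, 1])
print(chain, out)                    # chain now holds the shifted-in vector
```

Because every vector rides on this one serial path, test throughput depends heavily on how efficiently the driver software sequences and overlaps the shifts, which is why so much of boundary scan's value lies in that software.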
IMPROVING TEST QUALITY
Realizing high-quality test results cost-effectively (i.e., letting fewer defective devices escape detection at test time) is a major challenge. The fewer the defects that go undetected, the lower the "fallout" rate. The challenge grows even greater as devices are tested at operating clock rates ("at-speed"), with the fallout rate otherwise climbing exponentially. Eventually, with shrinking IC device geometries, a point is reached where conventional at-speed testing becomes impractical or even impossible. For example, at line widths of 0.18 µm and lower, the fallout rate becomes too high to be acceptable (Fig. 3).
Nvidia Corp. (www.nvidia.com), a company with extensive experience in at-speed and static testing, suggests using embedded deterministic scan-based test methodologies. The idea is to control testing costs while maintaining high-quality testing.
To significantly reduce the amount of test data required and whittle down the number of I/O pins needed, highly compressed pattern sets 100 times smaller than the original test set are created. A tester delivers these compressed patterns to the device, where an on-chip decompressor expands them and applies them through a large number of internal scan channels. Once the test is finished, the responses are compacted before being shifted back off-chip to the tester. The decompression and compaction logic sits outside the cores under test and is inserted into the scan path only when the device is being tested. Test data volume and test time are reduced by a factor of 100.
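The 100x figure is plausible because deterministic test cubes for large scan designs specify only a small fraction of their bits; the rest are don't-cares. A toy encoding that stores only the specified bits shows the effect. This is a deliberately simplified sketch, assuming a position-list encoding, not the actual ring-generator-based compression used in commercial EDT:

```python
# Illustrative sketch (not a real EDT implementation): encode only the
# specified bits of a test cube and let a "decompressor" fill the rest.

def compress_cube(cube):
    """Encode a cube of '0'/'1'/'X' as (length, [(position, bit), ...])."""
    return len(cube), [(i, b) for i, b in enumerate(cube) if b != 'X']

def decompress_cube(length, specified, fill='0'):
    """On-chip decompressor analogue: expand, filling don't-cares."""
    bits = [fill] * length
    for i, b in specified:
        bits[i] = b
    return ''.join(bits)

# A 64-bit cube with only 3 specified bits -- typical cubes are far
# larger and even more sparsely specified.
cube = list('X' * 64)
cube[3], cube[17], cube[42] = '1', '0', '1'
cube = ''.join(cube)

length, spec = compress_cube(cube)
expanded = decompress_cube(length, spec)
assert expanded[3] == '1' and expanded[17] == '0' and expanded[42] == '1'
print(f"specified bits: {len(spec)} of {length}")
```

The tester only needs to deliver the few specified bits per cube, while the on-chip logic fans the expanded pattern out across many internal scan channels, which is where both the data-volume and test-time savings come from.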
One of the issues facing IC and board manufacturers is whether DFT-based approaches actually deliver the economic benefits claimed for them. Many of today's high-priced, precision-designed IC testers are still being utilized for DFT-based testing for lack of anything more appropriate in the way of test gear. As a result, these testers also continue to be used for functional testing, which was their original intent. This ultimately negates some of the cost savings DFT testing is said to provide, because functional testing takes more time to perform.
But many studies by test engineers show that using functional testing for DFT-validated IC designs doesn't necessarily improve end-product quality. As a result, some large IC manufacturers no longer use costly functional testing of ICs after they undergo DFT validation. They're satisfied with the results of DFT validation at the IC design stage.
It's becoming more obvious that the testing problem is a challenge that's getting tossed more often to the chip designer, not just into the lap of the test engineer. Even so, many of these test problems still end up with the test engineers.
For DFT methodologies to be truly cost-effective, the two teams must find a better way to work together as equal team players, and not as separate entities. Cost-effective testing with future high-speed and complex ICs requires serious thought early in the design phase. Only in this way can the product's quality be ensured throughout the entire manufacturing phase, including final production.