Chip suppliers and automatic test equipment (ATE) manufacturers face a growing crisis that's likely to have a profound effect on designs: the time and cost of testing are rising dramatically, to nearly uneconomical levels.
One result could be that chips ship without enough testing in order to meet time-to-market and time-from-purchase-order goals. That can lead, and already has led, to performance and reliability problems in thousands of systems in the field. Or ATE manufacturers, trying to catch up, could continue to develop ever-more-sophisticated testers that cost more money, further exacerbating an already tense situation between chip and test people.
Consider what's happening. On the chip side, feature sizes shrink about 30% every three years. Meanwhile, chip sizes increase approximately 12% annually, and the number of new designs begun each year grows by 50% or more. That's already a problem: the design productivity gap keeps widening (Fig. 1).
Furthermore, the sizes of defects that can cripple sections of a die don't shrink in proportion to the reduced feature sizes, making it easier for a single defect to ruin a chip. In addition, gate delays are shorter, the number of interconnect layers is increasing, and interconnect delays are becoming dominant, reports a survey from Mentor Graphics Corp.
Device complexity also increases with proliferating system-on-a-chip (SoC) designs. The Mentor Graphics survey notes that these large-gate-count designs may contain such features as a microprocessor, buses, peripherals, ASIC sections, compiled embedded software, IP cores (hard, firm, and soft), reusable blocks, multiclock domains, multifrequency domains, PLLs and on-chip-generated clocks, multiple embedded memories, and analog and mixed-signal components. That's a tall order for electronic design automation (EDA), let alone testing.
An SoC Trend
The trend toward SoC-like design is accelerating. According to Rodger W. Sykes, vice president of marketing and business development for LogicVision Inc., about 1100 designs this year will be devices with more than 1 million gates. By 2003, he expects that number to reach 4000.
This level of complexity requires a lot of test data for ATE to manage, up to 1 kbyte per gate. That works out to as much as 1 Gbyte of test data for a 1-million-gate device and 10 Gbytes for a 10-million-gate device.
Moreover, test times to achieve fault coverage are becoming longer. "Although running 10 million clock cycles would achieve very good fault coverage for a 1-million-gate design, testing with that many cycles would take far too long," says L.T. Wang, president and CEO of SynTest Technologies Inc. "Even a barely adequate 10,000 clock cycles would take too long," he adds. "That's because there are six million potential faults in a 1-million-gate design, in checking the ones and zeros for the two-input and one-output port per gate," Wang explains.
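Wang's fault arithmetic can be reproduced directly: under the standard stuck-at fault model, each two-input, one-output gate has three fault sites, and each site can be stuck-at-0 or stuck-at-1. A minimal sketch:

```python
# Stuck-at fault model, per Wang's description: each gate has two inputs
# and one output (three fault sites), and each site can be stuck at 1 or 0.
FAULT_SITES_PER_GATE = 3  # two inputs + one output
STUCK_AT_VALUES = 2       # stuck-at-0 and stuck-at-1

def stuck_at_faults(gate_count):
    """Number of potential stuck-at faults in a design of gate_count gates."""
    return gate_count * FAULT_SITES_PER_GATE * STUCK_AT_VALUES

print(f"{stuck_at_faults(1_000_000):,}")  # 6,000,000
```

That six-million figure is the target an ATPG tool or BIST controller must cover, which is why even "barely adequate" cycle counts grow so quickly with gate count.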
"And, that's not even for full fault coverage, which would take too much time," says Sykes. To cope with this increasing chip complexity, ATE vendors offer high-end digital testers costing $4 million and up, observes Sykes. "As chips get even more complex, testers will cost $10 million to $15 million in a few years," he says. "It will come to a point that it costs as much to test the silicon as it does to make it," Sykes declares. "And that's just economically unacceptable."
Also alarming is a rapidly growing "tester gap." In the next decade or so, tester accuracy will improve from 200 ps to only 175 ps. But chip clock rates will increase to 3 GHz (a period of about 330 ps), according to Mentor Graphics. That would leave a timing error of roughly 50% of a clock period between chip performance and ATE capabilities.
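Mentor's projection reduces to a simple ratio of tester accuracy to clock period, which this sketch works through:

```python
# Mentor Graphics' projection: tester accuracy reaches 175 ps while chip
# clocks reach 3 GHz. How large is the timing error relative to one cycle?
tester_accuracy_ps = 175.0
clock_ghz = 3.0

period_ps = 1000.0 / clock_ghz           # a 1-GHz clock has a 1000-ps period
margin = tester_accuracy_ps / period_ps  # error as a fraction of one cycle

print(f"clock period: {period_ps:.0f} ps")          # ~333 ps
print(f"error margin: {margin:.3f} of a cycle")     # ~0.525, i.e. roughly 50%
```

In other words, the tester's measurement uncertainty would span about half of every clock cycle it is trying to resolve, which is what makes at-speed external test of such parts so difficult.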
Not only that, but yield losses on chips due to the inaccuracies of ATE equipment could rise to almost 50% in the next decade or so, up from only about 10% a few years ago. On new device designs, yields may be less than 20%.
If external ATE isn't fully up to the job, wouldn't it be prudent to offload some of the testing tasks onto the chip itself? This is what has been evolving under a concept called design for test (DFT). An umbrella term like EDA, DFT covers a variety of techniques to enlist the device itself in helping ease the pain of testing.
In this case, DFT includes several known and newer approaches that place various hardware and software features on-chip. These approaches include automatic test-pattern generation (ATPG), scanning in its various forms, fault simulation, and built-in-self-test (BIST), which comes in two versions called logic BIST and memory BIST.
ATE Alliance With BIST
Despite what some fervent DFT missionaries would like to think, it's generally acknowledged that no one part of the test equation can carry the day by itself. The ATE side seems to agree: Teradyne and Credence, two large ATE manufacturers, have taken equity positions in LogicVision Inc., a supplier of BIST products.
"A big hurdle is getting design engineers to consider test earlier in the design process," observes Ian Burgess, the product marketing manager for logic BIST products with Mentor Graphics Corp. "Overall, test trends are driving designers in that direction," he remarks. Up until now, though, designers all too often created the chip and "threw it over the wall" to manufacturing for figuring out how to test it. DFT advocates claim that won't do any longer (Fig. 2). This might be particularly true with BIST, which carries DFT in a new direction.
The BIST concept has been used in the military-aerospace industry for decades to ensure the performance and reliability of the Space Shuttle, aircraft, and other mission-critical inventory. It began to surface in the semiconductor industry about 10 years ago.
A DFT technique, BIST accomplishes testing through built-in hardware and software features. BIST with DFT embeds controllers or small testers on-chip. The testers provide pattern generation, at-speed timing, selectable modes, and go/no-go or diagnostic tests. Implementing BIST is typically fully automatic and fits into an RTL design flow (Fig. 3).
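The article doesn't spell out how on-chip pattern generation works, but in logic BIST it is classically done with a linear-feedback shift register (LFSR), a few flip-flops and XOR gates that cycle through pseudo-random states. A minimal software model of the idea (the width and tap positions here are illustrative, not from any vendor's product):

```python
def lfsr_patterns(seed, taps, width, count):
    """Model a Fibonacci LFSR: yield pseudo-random test patterns.

    seed  -- nonzero starting state
    taps  -- bit positions XORed together to form the feedback bit
    width -- register width in bits
    count -- number of patterns to produce
    """
    state = seed
    for _ in range(count):
        yield state
        feedback = 0
        for t in taps:
            feedback ^= (state >> t) & 1  # XOR the tapped bits
        state = ((state << 1) | feedback) & ((1 << width) - 1)

# A maximal-length 4-bit LFSR cycles through all 15 nonzero states
# before repeating, giving good pseudo-random coverage for free.
patterns = list(lfsr_patterns(seed=0b1000, taps=(3, 2), width=4, count=15))
print(len(set(patterns)))  # 15 distinct patterns
```

In hardware this costs only a handful of gates per register bit, which is why LFSR-based generation is so much cheaper than storing explicit vectors on the tester.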
The technique has wide applications, such as BIST intellectual property (IP) cores, hardware debugging, characterization testing, burn-in testing, board testing, and manufacturing testing, of course. It's interesting to note that although BIST has other names, like Embedded Test at LogicVision and Integrated Test at Fluence Technology Inc., it's still BIST.
Scan techniques are considered fundamental for BIST designs. Scanning simplifies ATPG, debugging, and diagnostics. Although it adds a little silicon overhead, scanning is an effective and predictable process, offering increased control, observation, and test capabilities. In addition, it offers simpler test control, as well as improved hardware debugging and verification.
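The scan mechanism behind those benefits is easy to model: in scan mode, the chip's flip-flops are reconfigured into one long shift register, so any internal state can be set (and later read back) through a single serial pin. A toy sketch, with hypothetical names and sizes:

```python
def scan_load(chain_length, vector_bits):
    """Shift a test vector, bit by bit, into a modeled scan chain.

    Each clock cycle, a new bit enters at the head of the chain and
    every stored bit moves one position toward the tail, exactly as a
    serial shift register behaves in scan mode.
    """
    chain = [0] * chain_length
    for bit in vector_bits:
        chain = [bit] + chain[:-1]  # serial-in at the head, shift toward tail
    return chain

# Loading a 4-bit vector takes 4 clock cycles; the first bit shifted in
# ends up deepest in the chain.
print(scan_load(4, [1, 0, 1, 1]))  # -> [1, 1, 0, 1]
```

The cost is visible here too: loading or unloading an n-flip-flop chain takes n clock cycles, which is the "little silicon overhead and extra test time" trade-off scan accepts in exchange for full controllability and observability.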
Aside from scanning, a good BIST design should have scalability in terms of connectivity, clocking, and data propagation within the device. ATPG can be used with it for higher fault coverage. "Today, in fact, ATPG is 75% to 80% of the test market while BIST is still fairly small," remarks Burgess.
BIST also works in mixed-signal environments where issues parallel those with logic and memory but with a twist. "Mixed-signal content and complexity also are increasing, engendering a large productivity gap between design and testing that's even worse than the one between manufacturing and design," says Mike Kondrat, marketing vice president of Fluence. There's a lack of test automation tools in an environment where the cost of testing is driven by test complexity, test time, and accessibility.
"The analog/mixed-signal BIST market is experiencing very strong growth," notes Kondrat. "That's because designers building SoC designs are looking at analog/mixed-signal BIST as the only method for solving difficult test problems associated with those tricky technologies," he says. "Such BIST solutions need to be developed in conjunction with customers to meet 'real world' test requirements."
Furthermore, "Measuring analog signals is inherently more complex versus digital measurements because one is measuring a probable outcome around some specification," explains Kondrat. This outcome also is affected by noise, voltages, and timing, whereas in digital testing, one is usually looking at a determined state of a one or a zero.
"Making BIST more useful is the ability to integrate the BIST solution into a project and distribute the solution between the silicon chip and the ATE system," Kondrat explains. "Another benefit is that BIST IP is reusable and can be easily ported to many different processes ranging from FPGAs in CMOS to ultra-high-speed silicon-germanium (SiGe) processes." Kondrat remarks further, "Converting analog/mixed-signal information to the digital realm can be implemented through industry-standard interfacing, such as the IEEE-1149.1 scan."
A plus, according to Kondrat, is that "these BIST solutions reduce the rising cost of testing." He explains, "They enable designers to use simple digital testers on complex high-speed digital or analog/mixed-signal SoC designs that may otherwise require high-end, expensive mixed-signal test equipment." The result, Kondrat states, is "lower-cost testing capabilities that cover the widest range of difficult applications," applications that would otherwise add significantly to test cost.
"For BIST to succeed," Kondrat claims, "it must be more cost-effective than traditional ATE systems, have a high-performance capability that addresses high speed with high accuracy, be easily re-usable in several designs or processes, have easy integration and verification so that BIST can be designed into an SoC quickly, and have software automation tools with the ability to analyze and develop BIST solutions via software automation."
BIST does entail some issues concerning silicon overhead. For example, a 20,000-gate BIST block would loom large in a small design but be fairly minuscule in a 1-million-gate device. This touches on an allied but important trend in design methodology: reusability. This is critical in closing the design productivity gap because test reuse makes for effective design reuse. Reusing proven IP cores for BIST, other test methods, and device designs themselves could considerably speed up time-to-design, time-to-test, time-to-manufacture, and time-to-market.
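The overhead argument is simple proportion. Assuming, purely for illustration, a 100,000-gate "small" design (the article gives no figure for the small case):

```python
def bist_overhead_pct(bist_gates, design_gates):
    """BIST silicon overhead as a percentage of total design gates."""
    return 100.0 * bist_gates / design_gates

BIST_BLOCK = 20_000  # gate count of the BIST block, from the article

print(bist_overhead_pct(BIST_BLOCK, 100_000))    # 20.0 -- looms large
print(bist_overhead_pct(BIST_BLOCK, 1_000_000))  # 2.0  -- fairly minor
```

Since the BIST block is roughly fixed in size, its relative cost falls as designs grow, which is one reason BIST becomes more attractive precisely for the multimillion-gate SoCs that are hardest to test externally.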
Not everyone is sanguine about today's BIST. David Hsu, group marketing manager with the Test Automation Group at Synopsys Inc., thinks that the logic BIST on the market is somewhat premature because it leaves concerns about timing constraints, area, and performance.
So, Synopsys is joining the BIST fray. The company has announced a memory BIST product for early next year and plans to announce a logic BIST product fairly soon because it's the number one request from customers.
Looking forward, BIST companies are creating seamless, or nearly seamless, DFT flows. Synopsys, for one, is pursuing a hierarchical, top-down concept in which EDA test tools begin at the RT level and are integrated with synthesis. Test synthesis accounts for layout issues and is integrated with physical design tools. Test synthesis can directly synthesize all DFT structures, including BIST, with full constraint optimization. Plus, the concept includes complete automated creation, verification, and management of design data created and consumed by EDA test tools, the company says.
Mentor Graphics states that it has a unified approach to DFT by combining test synthesis, ATPG, memory BIST, logic BIST, and boundary scanning. Among other functional elements, the approach includes features for graphical testability troubleshooting, vector interfaces for writing ASIC vendor-specific test-vector formats, and an integrator for test-access synthesis, test reuse, and test-set integration. SynTest Technologies is another vendor offering suites of BIST and DFT tools.
Companies like LogicVision emphasize a total BIST approach. This particular company includes a wide range of embedded test and diagnostic capabilities on its BIST boards (Fig. 4).
Another trend encompassing BIST and DFT tools is the use of worldwide resources to hasten the development of test designs for high-gate-count SoCs. In an almost "divide-and-conquer" strategy, worldwide design teams would break down a complex design into segments, with a specific team working on each segment, notes Burgess. "Using integrated BIST, DFT, and other design tools, these teams could stitch together a complete design using the ubiquitous and efficient Internet for to-and-fro communications," he says. Because the Internet is always available, worldwide design teams can also work independent of time zones.
Downstream, some observers foresee ATE vendors—in an effort to simplify testing and lower costs, or at least keep them from rising too fast—relinquishing test functions best handled by BIST to concentrate on those best undertaken by ATE systems. There may, however, be some incompatibility issues concerning test methodologies to work out before that can happen. But BIST will have truly arrived when that day comes.
BIST is a growing methodology that uses on-chip test structures, in concert with ATE, to help close the testability gap for SoCs and other large designs. No one methodology will close the gap by itself, but BIST integrated with other DFT approaches, such as ATPG and scanning, can offer designers powerful new ways to implement on-chip testing.
Proponents say that BIST won't eliminate ATE, but it can significantly lower test costs. BIST seems to be changing from an alternative method of testing to an economical and preferred method of testing complex devices.
Companies Mentioned In This Report

Fluence Technology Inc.
Fax (503) 672-8700

LogicVision Inc.
Fax (408) 467-1180

Mentor Graphics Corp.
Fax (503) 685-1214

Synopsys Inc.
Fax (650) 965-8637

SynTest Technologies Inc.
Fax (408) 720-9960