One of the recurring questions over the last 30 years has been “Why do we need to test chips?” Yes, in a perfect world, semiconductor test would not be necessary. But the world is not perfect, especially as we continue to push technology to its limits at the famous Moore’s-Law pace. So the real question is “What will be the challenges of semiconductor test in the year 2000?”
I see three major challenges facing the semiconductor test industry as we enter the 21st century:
- Speed and accuracy.
- System-level integration.
- Test process and packaging innovations.
Each of these topics presents challenges and opportunities to test suppliers and their customers. The International Technology Roadmap for Semiconductors (ITRS) defines many of these requirements and sets objectives for the industry.
Speed and Accuracy
From a conventional test perspective, it is off-chip speed that challenges ATE suppliers the most. On-chip speeds can be much higher, but the external tester cares about the timing and functionality at the device pins.
The ITRS projects off-chip nonreturn-to-zero (NRZ) data rates of 1,600 MHz by the year 2002. Chip frequencies have increased at a compound annual rate of 30% for the last two decades, while ATE has managed only about 12% per year over the same period. The result is a widening performance gap: as tester speed and accuracy lag device performance, timing guardbands must grow, and yield suffers.
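To put rough numbers on that divergence, the short sketch below simply compounds a 30% annual growth rate against a 12% one. The starting frequencies are placeholders of my own choosing, not roadmap data; only the ratio between the two trends matters.

```python
# Illustrative only: compound a 30%/year chip-frequency trend against a
# 12%/year ATE-speed trend.  Starting values are arbitrary placeholders;
# the point is how quickly the ratio between them grows.

chip_mhz = 400.0   # hypothetical chip data rate at year 0
ate_mhz = 400.0    # assume the tester starts out matched

for year in range(11):
    print(f"year {year:2d}: chip {chip_mhz:6.0f} MHz, "
          f"ATE {ate_mhz:6.0f} MHz, gap x{chip_mhz / ate_mhz:.2f}")
    chip_mhz *= 1.30   # 30% compound annual growth
    ate_mhz *= 1.12    # 12% compound annual growth
```

After a decade at those rates, the chip is running more than four times faster than the tester that started out even with it.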
More important than raw tester frequency is the requirement for accuracy. The rule of thumb is to strive for an overall timing accuracy (OTA) of 5% of the minimum cycle period. As I/O speeds approach 1 GHz, that translates to an OTA of 50 ps or better, or ±25 ps (Figure 1).
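As a quick check of that arithmetic, here is a minimal sketch of the 5% rule of thumb (the helper name is mine, not any vendor’s API):

```python
def ota_budget_s(io_rate_hz: float, fraction: float = 0.05) -> float:
    """OTA budget as a fraction (rule of thumb: 5%) of the minimum cycle period."""
    return fraction / io_rate_hz

total_s = ota_budget_s(1e9)   # 1-GHz I/O -> 1-ns minimum cycle
print(f"OTA budget: {total_s * 1e12:.0f} ps total, +/-{total_s * 1e12 / 2:.0f} ps")
# -> OTA budget: 50 ps total, +/-25 ps
```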
Achieving that accuracy is no small task for test systems that also face demands for increased pin count, flexibility, and functionality. Because a tester depends on semiconductor technology to achieve its performance, the ATE manufacturer’s challenge is to use today’s technology to build systems that will test tomorrow’s ICs.
To date, this challenge has been met by designing for timing stability in the ATE system, then using sophisticated measurement and calibration techniques to compensate for systematic errors. As we progress along the roadmap, it will be increasingly difficult to measure these parameters, much less compensate for them.
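One deliberately simplified way to picture that compensation step is a per-channel deskew table: measure each channel’s systematic edge-placement error against a common reference, then offset the programmed edge accordingly. The channel names and skew values below are invented for illustration, and random jitter, of course, cannot be calibrated out this way.

```python
# Simplified per-channel deskew model (illustrative values, not real ATE data).
# Systematic skew measured against a common reference can be stored and
# subtracted out; random jitter cannot be removed this way.

measured_skew_ps = {"ch0": +12.3, "ch1": -8.7, "ch2": +3.1}

def compensated_setting_ps(channel: str, desired_edge_ps: float) -> float:
    """Edge time to program so the physical edge lands where it was asked to."""
    return desired_edge_ps - measured_skew_ps[channel]

for ch, skew in measured_skew_ps.items():
    setting = compensated_setting_ps(ch, 500.0)
    physical_edge = setting + skew          # what the DUT actually sees
    print(f"{ch}: program {setting:.1f} ps -> edge lands at {physical_edge:.1f} ps")
```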
Specsmanship has become a way of life in the ATE industry, and vendors tend to define the accuracy of their systems within a finite set of conditions. At this year’s International Test Conference (ITC), one ATE marketing executive joked that OTA really stands for “our timing accuracy.”
Attempts to standardize OTA have not succeeded. Independent time-measurement instrument providers such as Wavecrest in Edina, MN, have developed tools to measure edge-placement accuracy and stability over time, but these measurements have yet to show that ATE accuracy really meets specification over all ranges of real-world use.
So where will the solutions come from? Naturally, ATE suppliers will evolve the accuracy performance of their systems through continuous improvements in design and calibration. This will be fueled by customer and competitive pressures.
We can expect to see more system-on-chip (SOC) integration of the tester channel to contain and control performance. However, this performance ultimately is limited by the fixturing between the tester channel and the device under test (DUT).
The real solutions will be on the DUT itself, in the form of built-in self-test (BIST) or embedded test, as it is called by LogicVision, a tool supplier in San Jose, CA. With BIST, at-speed test signals no longer have to traverse the off-chip interfaces and signal plumbing that can only degrade performance (Figure 2).
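To make the idea concrete, here is a toy software model of a logic-BIST loop; it is my own sketch, not LogicVision’s implementation and not real hardware. An on-chip LFSR supplies pseudorandom stimulus, responses are compacted into a signature, and the external tester only has to start the test and compare a pass/fail result.

```python
# Toy software model of logic BIST (illustrative only): pseudorandom
# patterns from an LFSR, responses compacted into a signature, external
# access reduced to "run" plus a pass/fail compare.

def lfsr16(seed: int):
    """16-bit Fibonacci LFSR, polynomial x^16 + x^14 + x^13 + x^11 + 1."""
    state = seed
    while True:
        bit = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
        state = (state >> 1) | (bit << 15)
        yield state

def run_bist(circuit, n_patterns: int = 1000, seed: int = 0xACE1) -> int:
    """Apply LFSR patterns to `circuit` and fold its responses into a signature."""
    patterns = lfsr16(seed)
    signature = 0
    for _ in range(n_patterns):
        response = circuit(next(patterns))
        signature = ((signature << 1) ^ response) & 0xFFFF   # crude compactor
    return signature

good = lambda x: (x * 3) & 0xFFFF      # stand-in for fault-free logic
faulty = lambda x: (x * 3) & 0xFFFE    # same logic with output bit 0 stuck at 0

golden = run_bist(good)
print("good die:  ", "pass" if run_bist(good) == golden else "fail")
print("faulty die:", "pass" if run_bist(faulty) == golden else "fail")
```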
Admittedly, embedded test will apply mainly to large designs and will not be for everyone. For the next few years, ATE vendors still must keep pushing the envelope toward tighter accuracy, and everyone will need to refine interface design for maximum bandwidth.
System-Level Integration
The term SOC is overused in the industry, primarily as a vehicle for marketing hype. Market forecasters cannot even settle on a definition for SOC, so none of their forecasts agree on actual market size; they do, however, all show very strong growth in the years ahead.
I am not talking about SOC as a product, but about a process to integrate more features onto a single, more complex piece of silicon. Consequently, system-level integration (SLI) is a better term. SLI has the following attributes:
- Integrates multiple logic, memory, and analog functions.
- Uses embedded cores of existing functionality from internal or third-party sources.
- Maximizes the concept of design reuse to speed time-to-market.
From a test perspective, this means that more chips will require mixed-signal ATE to test the multiple functions in a single insertion. In 1999, we have seen an acceleration in sales of mixed-signal testers, and I anticipate this will continue through 2000 and beyond. These new chips will require systems with digital data rates in excess of 500 MHz in conjunction with 5-GHz sampling, at least 16-bit resolution, and a noise floor approaching -100 dB. Not an insignificant task.
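Those last two numbers go together. Using the standard ideal-quantization relationship SNR ≈ 6.02N + 1.76 dB, and reading the -100 dB figure as roughly dB relative to full scale (my assumption), 16 bits of resolution corresponds to a dynamic range in the high 90s of dB:

```python
# Sanity check of the resolution/noise-floor pairing using the ideal
# quantization-noise formula SNR = 6.02*N + 1.76 dB.  Interpreting the
# "-100 dB" figure as roughly dB relative to full scale is my assumption.

def ideal_snr_db(bits: int) -> float:
    return 6.02 * bits + 1.76

for bits in (12, 14, 16):
    print(f"{bits}-bit converter: ideal SNR ~ {ideal_snr_db(bits):.1f} dB")
# 16 bits -> ~98 dB, which is why 16-bit resolution and a noise floor
# approaching -100 dB go hand in hand.
```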
Such mixed-signal systems will be adequate for the simpler SLI chips, but even they do not address the problems of core accessibility and test-program reuse. The only effective way to address those problems is through design-for-testability (DFT) techniques.
As for the at-speed problem, I see the white knight of the 21st century coming in the form of BIST. This is being embraced by some ATE manufacturers as a solution to the roadmap challenges, not as a threat.
A year ago, Credence Systems and LogicVision announced a partnership to integrate embedded test tools into the tester operating system, although product announcements have yet to be made. Around six months ago, Teradyne announced a strategic investment in LogicVision and a similar partnership to develop products.
At this year’s ITC, Teradyne showed the LogicVision toolset for embedded test integrated into its Catalyst SOC tester. The demo effectively debugged at-speed logic and memory faults on an actual test chip. The tools are intuitive and enable the test engineer to extract fail data in a logical form that can be used to directly communicate with the designer. This level of access from the external ATE into the embedded cores will be necessary to effectively test 21st century SLI products.
With BIST/embedded test, the test engines are compiled right into the silicon and are as portable or reusable as the silicon design itself. Coupling this with integrated test and debug tools will enable the test engineer to speed time-to-market.
Test Process and Packaging Innovations
Not all challenges and opportunities will revolve around chip features and performance. In the next decade, we will see packaging and process changes affecting test.
Chip scale packages are in production today, but as they ramp up, we will see increased levels of strip-based handling throughout the back end. These arrays of chips on strip frames will support more efficient handling for test, particularly in parallel applications.
As these processes evolve into wafer-level packaging techniques, the industry will have an opportunity to eliminate a test step. Since the package will be applied to the entire wafer, why not test only once at the end of packaging, prior to chip separation? Of course, memories requiring redundancy repair still will have to be tested and laser-repaired prior to package application. However, many other high-volume parts will find wafer-level packaging appealing.
Another test-process simplification is to drive more test content back to the wafer-probe and burn-in steps, lightening the load at final test and potentially eliminating high-speed package test altogether, reducing overall test cost.
Conclusions
The year 2000 will bring with it more evolutionary demand for high-speed, high-accuracy performance and mixed-signal functionality in ATE. New packages and test processes will streamline manufacturing. The revolutionary change will come in the form of more on-chip self-test solutions that lighten the load on conventional external ATE. This shift will be enabled by a combination of practical tool sets for design and test and by compelling economics, as people realize that it is less expensive to test on-chip.
About the Author
Ron Leckie is CEO and research director of INFRASTRUCTURE. He has more than 29 years of experience in the semiconductor industry, including 11 years as vice president of engineering and then marketing at Megatest and 14 years in test engineering. Mr. Leckie graduated in 1970 with a BSc in electrical and electronic engineering at Heriot Watt University, Scotland. INFRASTRUCTURE, 18779 Kosich Dr., Saratoga, CA 95070, (408) 255-0853, e-mail: [email protected].
Published by EE-Evaluation Engineering, December 1999. All contents © 1999 Nelson Publishing Inc.