Jitter Testing – A High-Throughput Jitter Analysis Test Strategy
A promising solution for economical precision test can take your jitters out of jitter testing.
Multigigabit high-speed serial (HSS) interfaces like PCI Express, Fibre Channel, XAUI, and Serial ATA risk slow industry adoption without a cost-effective production test solution. Traditional synchronous digital ATE solutions and even traditional jitter analysis instruments like bit error rate testers (BERTs) and scopes either lack the necessary performance and capabilities or are too slow and costly for high-volume manufacturing of these devices.
By enhancing low-cost loopback test with the high-throughput jitter analysis capabilities of continuous time interval analysis (CTIA) technology, it is possible to achieve quality test coverage and a good return on investment (see sidebar). This approach offers the benefits of faster time to market, lower field returns, and accelerated debug of unexpected process variation issues.
Pay Now or Pay Later
Industry analysts have reported that in 2005 80% of PCs will support the PCI Express chip-to-chip interconnect and 100% of PCs will support the internal high-speed storage interconnect Serial ATA.1 But other advanced technologies, such as Rambus DRAMs (RDRAMs), have failed to reach their expected adoption rates quickly because adequate, cost-effective test solutions were slow to arrive.
The test quality and cost issues must be resolved before consumer, storage, and networking devices relying on these phase locked loop (PLL)-based HSS I/O technologies ramp up into high-volume production. The lack of a viable HSS interface test solution in high-volume manufacturing is quickly reaching crisis level for many IC manufacturers and ATE companies.
As evidence of the urgency of this matter, pick up any test-related journal today, and there's a good chance that one or more articles will address the issues of jitter and the qualification of multigigabit interfaces. Industry standards conferences such as the 2004 PCI Express Developer's Forum address the question, "What is the biggest jitter-related concern for PCI Express implementers?" The answer is not that there is no solution for jitter verification; indeed, there are many.
The real concern is that $120k for a pair of BERT channels or $250k for an eight-channel ATE option is too expensive outside of the characterization lab. This is especially true when the source-synchronous ATE option doesn't have the capability to capture asynchronous clock-embedded signals or test jitter with high accuracy and throughput.
Until recently, device interface timing was characterized in the lab and guaranteed by design. Due to the desire to reduce test cost and a lack of adequate test solutions, jitter test often was omitted at final test. However, the risk of customer returns is increasing for jitter-sensitive ICs that incorporate the latest in HSS interfaces. Although IC manufacturers wish to minimize test cost, it may become more costly in terms of lost revenue or yield when low-coverage or no-coverage test solutions are used.
Jitter-Enhanced Loopback Test Strategies
Loopback techniques feed a transceiver's output data stream back to its own receiver input to perform a gross functional self-test. They validate basic interface functionality and overcome the inability of ATE to test the asynchronous signals of HSS interfaces. In addition, loopback is a reasonably low-cost solution because test times generally are quite fast, on the order of 50 ms each for internal and external loopback tests depending on test pattern length and built-in self-test (BIST) complexity.
Table 1. Comparison of Simple and CTIA-Enhanced Loopback Test Coverage for HSS Interfaces
Although this technique certainly increases test coverage for HSS I/O devices, the loopback environment, especially the internal loopback environment, is very different from the normal operating environment of these devices in their system applications. And there are many timing parameters, some as simple as the skew across 32 PCI Express channels, that are crucial to ensuring device interoperability in a system environment.
To ensure real-world device performance, jitter-related tests such as receiver sensitivity, receiver jitter tolerance, transmitter jitter generation, and receiver/transmitter (RX/TX) jitter transfer must be performed. Without stressing the RX channels by injecting jitter into the RX test signal, there is no certainty the RX will perform under dirty-eye conditions at the far end of a serial lane. And without precisely analyzing the TX jitter components, the risk remains high that devices that pass loopback test still may fail because of TX signal integrity degradation caused by connectors, switches, and PCB traces.
In addition, without a way to fully analyze and separate jitter components such as data-dependent jitter (DDJ) and periodic jitter (PJ), simple loopback affords only limited protection. Should unexpected process changes or field failures occur, simple loopback cannot increase production test coverage spontaneously when needed.
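As a rough illustration of the kind of decomposition simple loopback lacks, the following Python sketch separates the pattern-correlated (DDJ) component from the random/periodic remainder of a set of time interval error (TIE) samples. It is a minimal example under assumed inputs, not a production or vendor algorithm; the function and variable names are hypothetical.

```python
import numpy as np

def decompose_jitter(tie, pattern_positions, pattern_len):
    """Split TIE samples into a DDJ estimate and a residual (RJ + PJ) term.

    tie               -- time interval error per measured edge, seconds
    pattern_positions -- index of each edge within the repeating test pattern
    pattern_len       -- number of edge positions in the repeating pattern
    (Assumes every edge position appears at least once in the data.)
    """
    tie = np.asarray(tie, dtype=float)
    pos = np.asarray(pattern_positions)

    # DDJ: the average TIE at each edge position of the repeating pattern.
    ddj_per_pos = np.array([tie[pos == p].mean() for p in range(pattern_len)])
    ddj_pp = ddj_per_pos.max() - ddj_per_pos.min()   # peak-to-peak DDJ

    # Residual after removing the pattern-correlated component;
    # its rms is an estimate of the random (plus periodic) jitter.
    residual = tie - ddj_per_pos[pos]
    rj_rms = residual.std()
    return ddj_pp, rj_rms, residual
```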
As a result, many companies that recognize that jitter testing will increase test coverage and quality assurance are experimenting with various design for test (DFT) and BIST solutions that add some internal AC timing checks. One published solution incorporates on-chip delay circuitry to perform some AC timing measurements like setup and hold times.2
Another on-chip solution can even measure random rms jitter (RJ) with some degree of accuracy as long as the test engineer provides separate precise PLL clocks to the TX and RX channels in a tightly controlled manner.3 However, none of the currently available DFT/BIST solutions have the complete test coverage necessary to ensure device performance in real-world environments without the addition of high-throughput precision jitter test such as is provided by CTIA technology.
Table 1 summarizes the key test requirements of source-synchronous serial interfaces and how well simple vs. enhanced loopback methods address these test requirements.
The Value of Jitter Test
Even though the end-equipment companies are tired of the phrase "correct by design," semiconductor suppliers have to grapple with the concept of cost of test vs. no test before they are convinced that adding jitter testing is worth the cost. But despite the additional test cost, enhancing loopback with jitter test capabilities is a reasonable investment that can yield surprisingly high returns in several areas. Consider the following analysis:
Test time costs for a device typically are anywhere from 1 to 10 cents/s. The rate depends on the pin count and performance level of the tester being used because it must cover reliability and maintainability (R&M), the initial tester purchase price, and the operator overhead to test the device.
Device A has a run rate of 30k pieces per month with an average test time of 30 s. Adding a 2-s jitter test increases the test time to 32 s.
CTIA Explained and Positioned
CTIA technology uses multiple counters to track both reference-clock and user signal pulse edges during the course of a measurement. The reference-clock counter serves as a coarse measure of time while interpolator circuits resolve the fractional time difference between whole clock periods and the selected user signal edge.
Block Diagram of CTIA
In the interpolator circuit, the selected event, En, of a user signal starts the capacitor discharge, and the reference clock stops it. The capacitor charge is held so the ADC can measure the voltage and convert it to a digital time value equal to the difference between the user signal edge being measured and the next reference-clock edge. This delta-time value is subtracted from the time spanned by the whole reference-clock periods that have elapsed since the (T0, E0) baseline time and event to produce the time stamp, Tn, for the selected edge of event En.
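A minimal numeric sketch of this time-stamp arithmetic, assuming a 100-MHz reference clock purely for illustration (the constant and function names are hypothetical, not instrument parameters):

```python
T_REF = 10e-9   # assumed 100-MHz reference-clock period; illustrative only

def cti_timestamp(whole_periods, interp_delta):
    """Time stamp Tn of event En, relative to the (T0, E0) baseline.

    whole_periods -- reference-clock periods counted since the baseline
    interp_delta  -- interpolator output: time from the measured user-signal
                     edge to the next reference-clock edge, in seconds
    """
    return whole_periods * T_REF - interp_delta

# Example: the measured edge arrived 2.3 ns before the 1,000th reference edge.
tn = cti_timestamp(1000, 2.3e-9)   # 9.9977e-06 s
```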
The measurement location in a nondeterministic serial data stream is controlled by the pattern marker because all event counts and measured edge time stamps are relative to a master (T0, E0) time and event reference. This technique provides the highest-accuracy jitter decomposition as described in the Fibre Channel Methodologies for Jitter Specification (MJS) method, "Jitter Measurements Using a Known Pattern and Pattern Marker."
Having a virtual rather than a physical pattern marker means that the CTIA does not require the 10-s to 20-s synchronization time that a noncontinuous TIA typically requires to lock onto a serial signal before any subsequent jitter measurements can be made. Additionally, the CTIA virtual-marker approach avoids the possibility of repeated but failed attempts to lock. Noncontinuous TIA may fail several pattern synchronization attempts in the presence of large jitter, which is another source of long test times.
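To make the virtual-marker idea concrete, the sketch below computes per-edge time interval error (TIE) by comparing CTIA time stamps against the expected edge locations of the known pattern, all referenced to the (T0, E0) marker. This is a simplified illustration under assumed inputs, not the instrument's actual processing; the names are hypothetical.

```python
import numpy as np

def tie_from_marker(timestamps, edge_unit_intervals, bit_period):
    """Time interval error of each measured edge vs. its expected location.

    timestamps          -- CTIA time stamps Tn, relative to the (T0, E0) marker
    edge_unit_intervals -- expected location of each edge, in unit intervals,
                           known from the repeating test pattern
    bit_period          -- nominal unit interval of the serial signal, seconds
    """
    expected = np.asarray(edge_unit_intervals, dtype=float) * bit_period
    return np.asarray(timestamps, dtype=float) - expected
```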
In addition to providing better jitter measurement capabilities, the use of CTIAs is economically justified. Digital ATE and traditional jitter instruments are too costly for HSS I/O production test.
Digital ATE
Digital ATE pin electronics provide an indirect method to search for edge timing in a repeating data bit stream. Recent ATE offerings have attained the 2.5-Gb/s to 3.2-Gb/s data rates needed to capture PCI Express or XAUI signals. However, the technique is slow, taking about 60 s on average, and it does not provide a precise, direct measure of timing.
In addition, this digital search technique does not capture the time-correlation information necessary to analyze the components of signal jitter. Also, most ATE digital pin electronics solutions do not have the capability to lock onto asynchronous, clock-embedded serial signals. This prevents performing simple functional tests such as validating data bit patterns in pseudorandom bit sequence (PRBS) or PCI Express compliance test patterns. Consequently, at $250k for eight channels, the digital sweeping window technique is an expensive investment that only provides a gross measure of total jitter and eye width.
BERT
The BERT has long been considered the gold standard for datacom device qualification. But with test times that run into minutes and hours, it is out of the question for high-volume manufacturing.
Oscilloscopes
Oscilloscopes recently have improved in performance to nearly acceptable test times in the range of 30 s to 60 s. But this still is too large a test-time overhead for consumer devices with average selling prices (ASPs) of $10 or less.
Adding the 2-s jitter test increases the cost of test, at the basic 1 to 10 cents/s rate, by 2 cents to 20 cents per device. For 30k devices, adding the test increases total test cost by $600 to $6,000 per month.
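A quick arithmetic check of those figures (a minimal sketch; the variable names are just for illustration):

```python
jitter_test_s = 2.0                       # added jitter test time, seconds
rate_low, rate_high = 0.01, 0.10          # test cost, dollars per second
volume = 30_000                           # devices per month

per_device = (jitter_test_s * rate_low, jitter_test_s * rate_high)  # $0.02 to $0.20
per_month = (per_device[0] * volume, per_device[1] * volume)        # $600 to $6,000
print(per_device, per_month)
```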
But this is not the whole story. The jitter test is added to decrease customer returns, rework, noncompliant parts, and engineering debug. Compliance testing is a very important factor here. The part may work great on the tester, but because of the jitter components, it may not meet the system specifications.
In reality, the cited costs without a jitter test are misleading. When you take into account yield loss from rejecting good devices (false fails) because the test accuracy is not adequate and other retest and debugging efforts associated with returns, the additional cost of jitter test is small.
Figure 1 and Figure 2 show that when you add in the cost of fallout after final test (field failures), the cost of omitting jitter test is far worse than the cost of adding jitter test. If the yield goes above 90%, then the overall cost of yield loss due to omission of jitter test is reduced. But for high average selling price (ASP) parts such as system-on-a-chip (SOC) or ASIC devices, this cost can become very high.
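The trade-off shown in the figures can be sketched as a simple parametric model. All numbers below are placeholders chosen only to illustrate the comparison, not data taken from Figure 1 or Figure 2:

```python
def monthly_cost_of_adding_test(volume, test_s=2.0, rate_per_s=0.05):
    """Added test cost per month at an assumed mid-range 5 cents/s rate."""
    return volume * test_s * rate_per_s

def monthly_cost_of_omitting_test(volume, escape_rate, cost_per_escape):
    """Expected monthly cost of jitter-related escapes when the test is omitted.

    escape_rate     -- assumed fraction of shipped parts that fail in the field
    cost_per_escape -- assumed per-part cost of returns, rework, debug, and lost revenue
    """
    return volume * escape_rate * cost_per_escape

# Placeholder example: 30k parts/month, 0.5% escapes at $50 per escape.
cost_add = monthly_cost_of_adding_test(30_000)                   # $3,000
cost_omit = monthly_cost_of_omitting_test(30_000, 0.005, 50.0)   # $7,500
```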
These examples are all easily understood. What cannot be visualized, however, is the customer's view of a manufacturer's product quality assurance, which may have a negative longer-term financial impact if bad devices escape during final test.
Conclusion
Although IC manufacturers worry about paying for extra test coverage and adding cost to every device tested, a low-cost loopback solution enhanced by the CTIA's capability to perform accurate and repeatable jitter tests with very high throughput is a reasonable investment that can yield surprisingly high returns in terms of the following:
Guaranteed by test vs. guaranteed by design supports a higher ASP and more design wins.
Faster identification and debug of new designs and unexpected process variation problems lead to faster time to market and lower failure-analysis costs.
References
1. Morgan Stanley Equity Research, "Hop on the Bus: E-Ticket Ride for ATE," Semiconductor Capital Equipment Report, September 2003, pp. 2-6.
2. Mak, T.M., et al., "Testing Gb/s Interfaces Without a Gigahertz Tester," IEEE Design & Test of Computers, Vol. 21, No. 4, July-August 2004, pp. 278-286.
3. Sunter, S. and Roy, A., "On-Chip Digital Jitter Measurement, From Megahertz to Gigahertz," IEEE Design & Test of Computers, Vol. 21, No. 4, July-August 2004, pp. 314-321.
About the Authors
Tammy L. McClure, vice president of marketing at GuideTech, is a 15-year semiconductor test-industry veteran. She also has held various technical test engineering, business development, and marketing positions at Hewlett-Packard (Agilent), Tokyo Seimitsu, and Altera. Ms. McClure earned a B.S.E.E. in 1989 from San Francisco State University. GuideTech, 140 Kifer Ct., Sunnyvale, CA 94086, 408-731-8857, e-mail: [email protected]
Stephen L. Spencer is the IIBU System Applications Manager at Texas Instruments. Mr. Spencer has been employed by Texas Instruments since 1986, presently in the ASIC division. He received a B.S.E.E. from the University of Nevada and an M.B.A. from the University of Texas. Texas Instruments, 12500 TI Blvd., MS 8666, Dallas, TX 75243, 214-480-4023, e-mail: [email protected]
FOR MORE INFORMATION
on random jitter: www.rsleads.com/412ee-335
on data-dependent jitter: www.rsleads.com/412ee-336