Two changes will occur in manufacturing test products in response to market pressures: embedded compression tools and logic built-in self-test (LBIST) will converge, and manufacturing test tools will be integrated with yield and failure-analysis tools and looped back into design tools to improve yield for ICs at 65 nm and below.

Detecting defects at these smaller geometries requires a broad range of sophisticated fault models to ensure sufficient test quality. As a result, the volume of scan test patterns has increased one hundred times. In response, many IC manufacturers have turned to embedded compression methods, which allow additional high-quality tests to be run while maintaining test throughput and cost. Current compression-based automatic test pattern generation (ATPG) tools have reduced test data volume and test times by anywhere from ten to one hundred times, depending on the design. A recent study by the International Technology Roadmap for Semiconductors (ITRS) group predicts that IC manufacturing test will require one thousand times the data volume compression within five years, and our research indicates that compression can indeed be extended to one thousand times current capability within that window. Production results since the first embedded compression ATPG tool, Mentor's TestKompress, was introduced in 2001 have shown that embedded compression markedly improves defect coverage and test quality for nanometer IC designs without increasing test times.

Chip designers use LBIST where non-vector testing is required, such as in-system test, burn-in, and simple in-package test. For example, manufacturers of safety-critical systems need very high quality in initial manufacturing testing and also need a way to run a self-test in the field.
Although LBIST can be an ideal solution for in-system test requirements, it may not supply the high quality and coverage needed for manufacturing test. LBIST typically does not achieve coverage as high as deterministic ATPG does, and it cannot target all of the fault models deterministic ATPG can. LBIST also cannot handle unknown values (X states) that may occur during test: the test will fail if the output contains even a single X state. Test engineers must therefore add X-bounding circuitry to the design to ensure that no X states propagate during test. This demands more design iterations, and test coverage on much of the added logic can be lost, reducing the quality and accuracy of test. Eliminating dynamic X states caused by false and multicycle paths during at-speed test is even more problematic.

With today's increasingly complex circuits, designers are struggling to make their in-house technology address these challenges while containing test cost and maintaining yield. Commercial test tools that combine embedded compression with LBIST would be a more cost-effective, higher-quality alternative. Combining these test methods allows more scan patterns to be run, ensuring higher quality and more efficient testing. Because of these pressures, manufacturers that use LBIST will probably start incorporating embedded compression technology as well.

Another change in manufacturing test will occur because conventional techniques for improving yield and reducing defects-per-million (DPM) rates for nanometer ICs, such as physical failure analysis, have been hurting time to market. In response, EDA vendors are addressing this critical challenge by creating solutions that feed test-failure diagnosis data from manufacturing back into the design process. This will enable dramatically faster yield learning, improving time to yield and reducing manufacturing costs.
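The X-state limitation discussed above can be made concrete. LBIST compacts scan-chain outputs into a multiple-input signature register (MISR); because the register is linear, a single response bit whose value is unknown makes the final signature unpredictable. The following toy model is purely illustrative (the register width, tap polynomial, and response vectors are all invented for this sketch, not taken from any vendor's implementation):

```python
# Toy 8-bit MISR showing why one unknown (X) scan output corrupts an
# LBIST signature. All parameters here are illustrative only.

def misr_step(state, inputs, taps=0b10001110, width=8):
    """Shift one parallel vector of scan outputs into the signature register."""
    feedback = state >> (width - 1)          # MSB feeds back through the taps
    state = (state << 1) & ((1 << width) - 1)
    if feedback:
        state ^= taps
    state ^= inputs                          # XOR in the scan-chain outputs
    return state

def signature(responses):
    """Compact a sequence of response vectors into one signature."""
    state = 0
    for vec in responses:
        state = misr_step(state, vec)
    return state

good = [0b1011, 0b0010, 0b1110, 0b0101]     # fully known responses
print(hex(signature(good)))

# Suppose bit 3 of the second response is an X. Depending on how the X
# resolves in silicon, the compacted signature differs, so a single X
# makes pass/fail comparison against a golden signature impossible:
x_as_zero = [0b1011, 0b0010, 0b1110, 0b0101]
x_as_one  = [0b1011, 0b1010, 0b1110, 0b0101]
print(hex(signature(x_as_zero)), hex(signature(x_as_one)))
```

This is why X-bounding logic is required: every potentially unknown source must be forced to a known value before it reaches the compactor.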
For example, failure data can be analyzed to identify statistically significant failures that point to an area of the design with higher-than-expected random failures and possible systematic defects arising from process limitations. By correlating these failures with physical verification and design-for-manufacture (DFM) analysis, failure-analysis and design teams can quickly zero in on the likely causes, such as areas where interconnects are too dense or lithographic hotspots with features that are difficult to print. Designers will be able to use volume failure data to identify new DFM rules that eliminate the most yield-limiting design features at the design or physical-verification stage. Moreover, the feed-forward loop from DFM to test will improve test quality by enabling targeted, deterministic testing for identified hotspots.

It makes sense to use the failure data collected from manufacturing test for faster identification of systematic failures. One of our customers, for example, achieved a 24% improvement in yield after two days of diagnosis effort using YieldAssist Analyzer (W. Hsu, et al., "Scan Diagnostics in the Nanometer Design Era," Semiconductor Manufacturing, March 2006). Future streamlining of automation, improved test accuracy, and accurate analysis will greatly accelerate yield learning and reduce the time to yield for new designs. Expect the integration of test, failure-analysis, yield-analysis, and physical-design tools, such as Mentor's Calibre Yield Analyzer, Calibre Yield Enhancer, YieldAssist, and Calibre nmDRC, to add value to IC design and manufacturing processes, reduce overall DPM levels, and decrease time to market.
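The idea of separating systematic defects from the random-defect background can be sketched with a simple statistical screen. This is a minimal illustration, not the algorithm of any particular diagnosis tool: it assumes failure counts per layout region follow a Poisson distribution under random defectivity and flags regions more than roughly three standard deviations above the expected mean. The region names, rates, and counts are hypothetical.

```python
# Illustrative screen for systematic defects: flag layout regions whose
# scan-diagnosis failure counts exceed the random-defect baseline
# (simple 3-sigma test on Poisson-distributed counts).
from math import sqrt

def flag_systematic(fail_counts, expected_rate, dice_tested):
    """Return regions whose failure count is improbably high under
    a random-defect (Poisson) assumption."""
    expected = expected_rate * dice_tested        # mean failures per region
    threshold = expected + 3 * sqrt(expected)     # ~3 sigma for a Poisson count
    return {region: n for region, n in fail_counts.items() if n > threshold}

# Hypothetical diagnosis results: failures attributed to each layout region
counts = {
    "dense_m3_routing": 42,
    "sram_block": 7,
    "litho_hotspot_a": 31,
    "random_logic": 9,
    "io_ring": 5,
}
suspects = flag_systematic(counts, expected_rate=0.002, dice_tested=4000)
print(suspects)   # regions worth correlating with DFM/lithography analysis
```

Regions that survive this screen are the candidates a failure-analysis team would then correlate with DFM rule checks and lithographic-hotspot data, as described above.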