Design For Manufacturing Must Move Up In The IC Flow

May 24, 2004

As long as there has been a semiconductor industry, each new generation of chips has been characterized by a drive for finer features. This drive has brought increased integration, enhanced functionality, density-improving optimizations, and new materials, resulting in smaller, faster, cheaper chips.

That is, until 130 nm altered the state of semiconductor design forever. Power issues, post-layout applications, and resistive and other speed-related defects, all formerly relegated to background noise, took center stage. These issues, which have a significant effect on chip yield (and failure), form the foundation for the new world of design for manufacturing (DFM). To keep the semiconductor industry viable at the 90-nm and 65-nm nodes, we first need EDA solutions that let designers optimize for manufacturing at the design, verification, tapeout, and test stages of the process.

Physical verification, being closest to the manufacturing line, went through the first wave of radical change when post-layout applications of resolution enhancement technology (RET), such as optical proximity correction (OPC), became mandatory at 180 nm. This first transition took physical verification tools out of the simple yes/no world of design-rule checking into a world of optimization, in which issues like planarization, antenna effects, and OPC required the design data to be modified. Without these modifications, yield and functionality were at significant risk; in some cases, first- and even second-pass silicon success proved elusive. Even more momentous, applying RET produced masks that were unrecognizable to the designer yet printed silicon that matched the designer's intent.

Manufacturing test is also evolving. For nanometer processes, the standard "stuck-at" fault model is no longer sufficient. When designs moved to 130 nm and below, speed-related defects became more prominent. With lower yields and higher defect rates, more advanced test methods were needed to detect new defect types, such as resistive bridges and vias. But more comprehensive tests create a secondary problem: test data sets so large they become unmanageable in terms of test time and cost. The challenge is to move to advanced nanometer processes while controlling test costs and maintaining, or improving, test effectiveness.
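To make the distinction concrete, here is a minimal toy sketch, not from the article, of why a static stuck-at test can pass a part that an at-speed (transition) test would reject. The delay figures, clock period, and single-gate circuit are all assumptions chosen purely for illustration.

# Toy model: a resistive via adds delay without corrupting logic values,
# so a static stuck-at test passes while an at-speed test fails.
# All numbers are illustrative assumptions.

NOMINAL_DELAY_NS = 0.8     # assumed healthy gate/interconnect delay
RESISTIVE_VIA_NS = 1.9     # assumed extra delay from a resistive via defect
CLOCK_PERIOD_NS = 2.0      # assumed at-speed capture window

def and_gate(a: int, b: int) -> int:
    """Logically correct AND gate: the defect is resistive, not a short."""
    return a & b

def path_delay(defective: bool) -> float:
    """Total path delay; the resistive via adds delay, not a wrong value."""
    return NOMINAL_DELAY_NS + (RESISTIVE_VIA_NS if defective else 0.0)

def stuck_at_test(defective: bool) -> bool:
    """Static test: apply patterns, wait long enough, compare logic values.
    The output still settles to the correct value, so the result does not
    depend on 'defective' at all."""
    del defective  # a resistive defect is invisible to a static logic check
    return all(and_gate(a, b) == (a & b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)])

def at_speed_test(defective: bool) -> bool:
    """Transition test: launch a transition and capture at the rated clock.
    The extra delay pushes the slow path past the capture edge."""
    return path_delay(defective) <= CLOCK_PERIOD_NS

if __name__ == "__main__":
    for defective in (False, True):
        print(f"defective={defective}: "
              f"stuck-at pass={stuck_at_test(defective)}, "
              f"at-speed pass={at_speed_test(defective)}")
    # Expected: the defective part passes the stuck-at test but fails at speed.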

A second wave of change is afoot. It can be characterized by a move from yes/no to maybe. This wave will result in a completely new breed of tools that find ways to further optimize the design for manufacturing and let the designer see where the inability to do so will impact yields, either positively or negatively.

DFM recognizes that a feature can be manufacturable and still have a negative effect on yield. Tools that intelligently leverage extensive design-rule expertise, account for the impact of parasitics on timing and power, and detect and pinpoint areas of concern, whether performance degradation or yield loss, will let designers optimally trade off size, performance, and yield. This cost/yield analysis, conducted in full-chip context, is made possible by broadened communication between designer and manufacturer that provides visibility into yield-limiting issues across layers and across the hierarchy.
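As a rough illustration of what such a cost/yield tradeoff can look like, the sketch below uses the classic Poisson defect-limited yield model, Y = exp(-critical area x defect density). It is not any particular vendor's analysis, and the defect-density, wafer-cost, and die-area numbers are assumptions.

# Illustrative cost/yield tradeoff using the textbook Poisson yield model.
# All constants are assumptions for the sake of the example.
import math

DEFECT_DENSITY_PER_CM2 = 0.5   # assumed random defect density
WAFER_COST = 4000.0            # assumed processed-wafer cost, USD
USABLE_WAFER_AREA_CM2 = 600.0  # assumed usable area of a 300-mm wafer

def poisson_yield(critical_area_cm2: float) -> float:
    """Defect-limited yield for the layout's defect-sensitive (critical) area."""
    return math.exp(-critical_area_cm2 * DEFECT_DENSITY_PER_CM2)

def cost_per_good_die(die_area_cm2: float, critical_area_cm2: float) -> float:
    """Rough cost of one working die: wafer cost spread over yielded dies."""
    dies_per_wafer = USABLE_WAFER_AREA_CM2 / die_area_cm2
    return WAFER_COST / (dies_per_wafer * poisson_yield(critical_area_cm2))

if __name__ == "__main__":
    # Option A: minimum-area layout, but more of it is yield-critical.
    # Option B: slightly larger (redundant vias, wider spacing), less critical area.
    a = cost_per_good_die(die_area_cm2=1.00, critical_area_cm2=0.60)
    b = cost_per_good_die(die_area_cm2=1.05, critical_area_cm2=0.35)
    print(f"Option A cost per good die: ${a:.2f}")
    print(f"Option B cost per good die: ${b:.2f}")
    # Despite being larger, Option B comes out cheaper per good die here.

The point of the toy numbers is only that die size, performance, and yield pull against one another, which is exactly the tradeoff a full-chip DFM analysis has to expose to the designer.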

All aspects of the design-tool flow should consider DFM. It would be a mistake to leave optimization and tradeoffs to the final tapeout stage; far too many decisions are etched in stone by that point, leaving little room to maneuver. For tapeout-stage optimization to be effective, the design-creation tools must radically cut the number of issues left until then. It makes little sense for automatic routers to place single vias if the first thing a DFM stage does is double a high percentage of them. Rather, a DFM tool would look for via-doubling opportunities in "exception" cases.
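The following hypothetical sketch suggests how such an exception-driven pass might behave on an abstract routing grid: it proposes a redundant via only where a neighboring cell is already free, and reports nothing for vias that are boxed in. The grid abstraction, coordinates, and function names are invented for illustration and do not describe any real tool.

# Hypothetical DFM pass: look for via-doubling opportunities after routing,
# adding a redundant via only where free space already exists.
from typing import List, Set, Tuple

Point = Tuple[int, int]   # via location on an assumed abstract routing grid

NEIGHBOR_OFFSETS = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def find_doubling_opportunities(single_vias: List[Point],
                                occupied: Set[Point]) -> List[Tuple[Point, Point]]:
    """Return (existing via, proposed redundant via) pairs where an adjacent
    grid cell is unoccupied, so doubling costs no extra routing resources."""
    proposals = []
    taken = set(occupied) | set(single_vias)
    for via in single_vias:
        for dx, dy in NEIGHBOR_OFFSETS:
            candidate = (via[0] + dx, via[1] + dy)
            if candidate not in taken:
                proposals.append((via, candidate))
                taken.add(candidate)   # reserve the spot for this proposal
                break                  # at most one redundant via per single via
    return proposals

if __name__ == "__main__":
    vias = [(2, 3), (5, 5), (7, 1)]
    blocked = {(3, 3), (1, 3), (2, 4), (2, 2)}   # (2, 3) is boxed in: no proposal
    for original, redundant in find_doubling_opportunities(vias, blocked):
        print(f"double via at {original} with redundant via at {redundant}")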

Still in its infancy, DFM has yet to be fully defined. But with the work being done today in yield issues at 90 and 65 nm, a foundation is being established for the future.
