EDA: Prototyping & Implementation

Jan. 6, 2003
Design Analysis Strikes A Balance

At 100 nm and below, designers of ASICs and SoCs find themselves facing a Gordian knot of physical roadblocks to implementation. In 2003 and for the foreseeable future, silicon virtual prototyping will become an essential technique for driving physical information further up into the logical design process. Only by accruing real, accurate information regarding interconnect delays can design closure be achieved in a reasonable amount of time. The alternative is an unacceptable amount of guardbanding and good old-fashioned guesswork, not to mention endless spins between synthesis and placement and routing.

It's an accepted fact that the handoff point to implementation will no longer be the gate-level netlist. Also accepted is that design analysis will be even more important, and on more fronts, than ever before. A fine balance must be struck between speed and detail. At the gate level, analysis loses too much detail to be useful. But doing too much analysis at the transistor level takes too long, stretching verification cycles to the horizon.

Timing analysis needs to evolve to incorporate dynamic timing, and static power analysis is likewise becoming a liability. Static representations of inherently dynamic nanometer effects, such as IR drop and coupling-induced crosstalk, are becoming untenable. The coarse modeling approximations made in static timing inject a growing amount of error as process geometries shrink, forcing designers to widen their guardbands to cover it (a rough sketch of that coupling margin appears below). Post-layout verification must strike the balance between gate-level and transistor-level analysis, particularly when interconnect delays dominate.
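
To make that guardband concrete, here is a minimal sketch, in Python with hypothetical RC values, of the switch-factor (Miller) approximation commonly applied to coupling capacitance in static timing; the spread between the bounding cases is the margin a static flow must absorb.

```python
# Minimal sketch of the switch-factor (Miller) approximation that static
# timing tools apply to coupling capacitance. All values are hypothetical.

def stage_delay(r_drive, c_ground, c_coupling, switch_factor):
    """One-stage RC delay; the coupling cap is scaled by a switch factor:
    0 if the aggressor switches with the victim, 1 if it is quiet,
    2 if it switches against the victim."""
    return r_drive * (c_ground + switch_factor * c_coupling)

R_DRIVE = 500.0      # ohms, hypothetical driver resistance
C_GROUND = 50e-15    # farads, wire-to-ground capacitance
C_COUPLE = 40e-15    # farads, wire-to-neighbor coupling capacitance

best = stage_delay(R_DRIVE, C_GROUND, C_COUPLE, 0.0)
nominal = stage_delay(R_DRIVE, C_GROUND, C_COUPLE, 1.0)
worst = stage_delay(R_DRIVE, C_GROUND, C_COUPLE, 2.0)
print(f"best {best*1e12:.0f} ps, nominal {nominal*1e12:.0f} ps, "
      f"worst {worst*1e12:.0f} ps")
# The worst-to-best spread (here 25 to 65 ps) is the margin a static flow
# must guardband; it widens as coupling dominates the total load.
```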

Though poorly understood in some quarters, design-for-test (DFT) technologies will continue to gain importance in the design process. The industry as a whole must make greater use of DFT. It's often inserted into designs as an afterthought, yet when used properly, its quality and economic benefits are demonstrable and critical to achieving acceptable yields.

At the back end of the implementation process, the growing use of resolution enhancement techniques (RETs) has enabled lithographers to keep Moore's Law on track longer than pundits expected. An emerging generation of EDA tools communicates through standard EDA formats and includes internal algorithms and simulators tuned to the process requirements of RET. Library-generation tools that produce RET-compliant layouts are a step forward.

TOP TEN

>COMPLEX FPGAS will see greater use in prototypes and early production, and even in full production in new application areas that ASICs once would have dominated. Expect early FPGA adoption of physical design tools for timing closure, of hardware/software co-design and verification for embedded FPGA system design, of DSP-specific toolsets, and of formal verification of FPGA prototypes against their ASIC counterparts. As FPGA vendors drive cost-per-gate down to historically low levels, programmable devices become attractive to a much wider audience, including the consumer and automotive sectors. They should even see more use in the low-volume, high-cost military/aerospace sector.

>CHIP-LEVEL DFT will see IEEE 1149.1 Boundary Scan (BSCAN) grow in importance, both as a control mechanism for BIST engines and as part of the test mechanism itself. Experiments are already under way with boundary-register designs that synchronize with internal scan (ISCAN). As a result, both stuck-at and delay-fault testing can be executed using the boundary register combined with the internal scan chains, allowing complete testing of the I/O shadow region without assigning a tester pin to every device pin.
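
As a hedged, highly simplified illustration of the launch-and-capture idea (a hypothetical Python model, not any vendor's flow): stimulus shifted into the boundary register exercises the pad-ring "shadow" logic, and the response is captured into an internal scan chain for shift-out and comparison.

```python
# Toy model of launching patterns from boundary-scan cells and capturing
# the response of hypothetical I/O "shadow" logic into internal scan.

def shadow_logic(a, b, c):
    """Hypothetical logic sitting between the pads and the core."""
    return (a & b, b | c)

def scan_test(fault=None):
    """Exhaustively launch 3-bit patterns from the boundary register and
    compare the captured internal-scan contents with the good machine."""
    failures = 0
    for pattern in range(8):
        a, b, c = (pattern >> 2) & 1, (pattern >> 1) & 1, pattern & 1
        captured = list(shadow_logic(a, b, c))   # capture into internal scan
        if fault is not None:                    # inject a stuck-at fault
            bit, value = fault
            captured[bit] = value
        if tuple(captured) != shadow_logic(a, b, c):
            failures += 1                        # mismatch seen at shift-out
    return failures

print("fault-free mismatches:", scan_test())              # 0
print("stuck-at-0 on capture bit 0:", scan_test((0, 0)))  # detected
```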

>DIAGONAL INTERCONNECT ROUTING may finally become production-ready in 2003. The X Initiative, a broadly supported semiconductor industry consortium, is pursuing development of an interconnect architecture that rotates the primary direction of the interconnect in the fourth and fifth metal layers by 45° from a Manhattan architecture. Leading vendors of routers are actively investigating the trend and/or developing tools to support it. Also under development is parasitic extraction technology that can model the resistance and capacitance of pervasive diagonal wires. Expect major announcements in this area.
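
As a rough, back-of-the-envelope sketch of the payoff (coordinates are hypothetical): with 45° segments available, the shortest wire between two pins is the octilinear distance rather than the Manhattan distance.

```python
# Back-of-the-envelope comparison of Manhattan versus octilinear
# (45-degree) wirelength between two pins; spans are hypothetical.
import math

def manhattan(dx, dy):
    return dx + dy

def octilinear(dx, dy):
    # Run the diagonal along the shorter span, then finish rectilinearly.
    short, long = sorted((dx, dy))
    return short * math.sqrt(2) + (long - short)

dx, dy = 300.0, 200.0   # microns, hypothetical pin-to-pin spans
m = manhattan(dx, dy)
o = octilinear(dx, dy)
print(f"Manhattan {m:.0f} um, octilinear {o:.0f} um, "
      f"saving {100 * (1 - o / m):.0f}%")
# For dx == dy the saving peaks at about 29% (1 - sqrt(2)/2).
```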

>INDUCTANCE EXTRACTION will become critical in analog/mixed-signal (A/M-S) design flows at the 90-nm process node. The tools used during design, during process development, and in mask making must be tightly integrated. Of particular importance will be the ties between the technology CAD tools foundries employ during process development and the RLC characterization of interconnect structures that forms the foundation models for layout extraction tools.
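
A rough rule of thumb, sketched below with hypothetical per-millimeter values, for why RC-only extraction stops being sufficient: once a wire's inductive reactance approaches its resistance over the signal's frequency content, RLC models are needed.

```python
# Rough sketch (hypothetical values) of when a wire's inductance matters:
# when the inductive reactance 2*pi*f*L approaches the resistance R over
# the signal's frequency content, RC-only extraction misses delay and
# overshoot, and RLC models are required.
import math

R_PER_MM = 25.0      # ohms/mm, hypothetical wide global wire
L_PER_MM = 0.5e-9    # henries/mm, hypothetical loop inductance

f_transition = R_PER_MM / (2 * math.pi * L_PER_MM)
print(f"inductance becomes significant above roughly "
      f"{f_transition/1e9:.0f} GHz")
# With ~50-ps edges, significant spectral content reaches about
# 0.35 / 50 ps ~= 7 GHz, so a wire like this sits near the RLC boundary.
```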

>ADDING PROCESS SIMULATION to layout verification tools enables process effects to be predicted automatically and flaws to be reported as potential design-rule violations, all through the same interface familiar from typical DRC checks. As tools embodying RET-compliant knowledge become standardized, the information they require will also become part of the standard package downloaded along with the classical design rules. Designers won't need to take seminars on diffraction, nor will lithographers have to learn how to debug the register transfer level. Process-aware EDA tools will continue to evolve to fill this gap.

>DYNAMIC POWER NOISE poses a new challenge to existing static-timing and static-power signoff methodologies. Full-chip dynamic effects from on- and off-chip inductance, simultaneous switching events, and the ad-hoc nature of designed-in decoupling capacitance are becoming the dominant constraints on the power integrity of SoCs. The lack of appropriate vectors at ASIC netlist handoff makes transient analysis of power-grid noise with high-capacity transistor-level simulators infeasible. Watch for tools in this area to make dynamic power analysis a reality; the capability is already sorely missed at the 130-nm process node and will be that much more critical at 90 and 65 nm.
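
A hedged, back-of-the-envelope comparison of the two noise terms (all numbers hypothetical): the static view sees only the average I*R drop, while the dynamic L*dI/dt term depends on how many drivers switch together and how fast, which no vectorless static analysis captures.

```python
# Hypothetical numbers comparing static IR drop with dynamic L*dI/dt noise.

I_AVG = 2.0          # amps, average core current
R_GRID = 0.01        # ohms, effective power-grid resistance
L_LOOP = 0.1e-9      # henries, effective package + grid loop inductance
DELTA_I = 0.5        # amps, current step when many drivers switch at once
STEP_TIME = 500e-12  # seconds, duration of that current step

static_drop = I_AVG * R_GRID
dynamic_noise = L_LOOP * DELTA_I / STEP_TIME

print(f"static IR drop ~{static_drop*1e3:.0f} mV, "
      f"L*dI/dt noise ~{dynamic_noise*1e3:.0f} mV")
# 20 mV versus 100 mV: the event-driven term dominates, and it shows up
# only when realistic switching vectors or activity models are applied.
```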

>NANOMETER ROUTING will emerge to optimize on-chip interconnects. Traditional grid-based routers offer no native support for variable-spacing, variable-width routing with on-the-fly prevention and correction of signal-integrity problems, all essential elements of a nanometer router. Nanometer routers must also be five to 10 times faster than traditional routers, and must take advantage of multiprocessing, to deliver overnight iterations. Lastly, the system must host highly accurate nanometer analysis, including fast, nanometer-grade delay calculation that accounts for signal-integrity and IR-drop effects.
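
As a hedged sketch of what such a router evaluates on the fly (per-unit values are hypothetical), a crude Elmore-style delay shows the trade-off: widening a wire cuts resistance but adds area capacitance, while spacing it out cuts coupling capacitance.

```python
# Crude Elmore-style estimate (hypothetical per-unit values) of the
# width/spacing trade-off a variable-width, variable-spacing router weighs.

def wire_delay(length_mm, width_um, spacing_um):
    r = 250.0 * (0.2 / width_um) * length_mm              # ohms: R falls with width
    c_area = 0.1e-12 * (width_um / 0.2) * length_mm       # F: area cap grows with width
    c_couple = 0.1e-12 * (0.2 / spacing_um) * length_mm   # F: coupling falls with spacing
    return 0.5 * r * (c_area + c_couple)                  # distributed-RC approximation

for width, spacing in [(0.2, 0.2), (0.4, 0.2), (0.2, 0.4), (0.4, 0.4)]:
    d = wire_delay(2.0, width, spacing)                   # a 2-mm route
    print(f"width {width} um, spacing {spacing} um -> ~{d*1e12:.0f} ps")
# Widening or spreading this (hypothetical) wire each trims roughly 25% of
# its delay, and the coupling reduction also directly helps signal integrity.
```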

>FULL-CHIP POST-LAYOUT VERIFICATION and analysis will become an important part of nanometer chip design. Only detailed, nanometer-ready verification technologies can combat the insidious imprecision that degrades timing, power, signal integrity, and design integrity. Every analysis and verification step carries an error margin that grows at nanometer geometries; as the error bound increases, precision suffers and bugs creep through to final silicon. The rise in re-spins at 130 nm testifies to the failings of static gate-level sign-off and raises doubts about relying on that approach alone.

>EDA AND TEST VENDORS WILL PARTNER to address the poor or nonexistent flow of data from the test process back into the design cycle. Unfortunately, only about 2% of today's design cycles are spent on design for test. Such partnerships will yield methodologies that streamline the job of ensuring that IC designs are testable; the alternative is costly mask respins. Bringing test validation forward into the designer's testbench flow, probably through the use of assertions, should help hold down mask iterations and alleviate yield problems caused by tight constraints. The goal is a form of virtual test validation.

>STANDARD-CELL LIBRARIES, the staples of ASIC and ASSP designers, limit the ultimate performance of the designs built on them. Standard cells place the onus on foundries and/or library vendors to define, implement, and characterize the cells while concealing the process details from the designer, and the result is a growing performance gap between full-custom and ASIC designs. Emerging tools transparently optimize cell-based designs at the logic and transistor levels to eliminate the barrier between placed-gate and transistor-level design representations.
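
A hedged illustration of that gap, using the logical-effort delay model with hypothetical, normalized numbers: a library offering only discrete drive strengths rarely lands on the stage effort that continuous transistor sizing can hit.

```python
# Logical-effort sketch (hypothetical, normalized capacitances) of the gap
# between discretely sized library cells and continuously sized transistors.

def chain_delay(stage_caps, c_load):
    """Delay of an inverter chain in tau units: each stage contributes its
    electrical effort (fanout) plus a parasitic term of roughly 1."""
    caps = stage_caps + [c_load]
    return sum(caps[i + 1] / caps[i] + 1.0 for i in range(len(stage_caps)))

C_IN, C_LOAD = 1.0, 64.0     # fixed input pin cap and fixed load

custom = [C_IN, 4.0, 16.0]   # continuous sizing: optimal effort of 4 per stage
library = [C_IN, 4.0, 8.0]   # only X1/X2/X4/X8 drive strengths available

print(f"custom-sized chain : {chain_delay(custom, C_LOAD):.1f} tau")
print(f"library-sized chain: {chain_delay(library, C_LOAD):.1f} tau")
# ~15 tau versus ~17 tau: the fixed menu of cell sizes leaves performance
# on the table that transistor-level optimization can recover.
```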
