Not only must the EDA industry satisfy designers' thirst for tools that can untangle a thickening knot of physical effects, it must do so from higher and higher vantage points with regard to levels of abstraction. More physical detail and more abstraction would seem like a contradiction in terms, but that's just what's on tap for next year.
SoC designers can expect a move toward design methodologies that push more implementation-level detail up the tool chain, thanks to increased adoption of system-level design languages. Meanwhile, synthesis technologies are evolving to bring more physical data into the derivation of netlists. Those netlists, in turn, enable more accurate place-and-route runs and a better shot at achieving timing closure.
On the design languages front, there's a growing interest in moving up from HDLs to a higher level of abstraction. Such approaches include C/C++ and the various flavors on the market as well as Superlog, a next-generation version of Verilog.
In many ways, timing closure itself is taking a back seat to other concerns for IC designers, and toolsets must change to reflect those concerns. Chief among them is power consumption. CPUs are on a path to dissipate very large amounts of power. Designers have been spared some pain because supply voltages have been dropping, and dynamic power scales with the square of the supply voltage. Further VCC drops, though, will probably bring substantial signal-to-noise ratio problems, making analog circuits very difficult to build.
That doesn't mean that timing closure will no longer be a concern for designers or that tools won't address physical design issues. At line widths of 0.13 µm and smaller, a host of physical effects must be accounted for to meet timing. Increasingly, the challenges of inductance extraction will be seen as a key hurdle to successful design. At this time, there's a lack of extraction capability for inductance on a full-chip level. The same holds true for crosstalk, a huge problem at ultra-deep-submicron levels. Crosstalk has even begun playing a role in digital design.
One technology that could mitigate these issues is silicon virtual prototyping. This intermediate step in the design process feeds more accurate physical information back into synthesis, yielding a better netlist for subsequent placement and routing.
Before designs can be signed off, they must be verified, and functional verification is another technology with room for growth and improvement. The "verification bottleneck" has been attacked from the hardware side with ever larger and more powerful simulation accelerators, but the problem of creating the right vector set to properly exercise the design remains.
A new tool paradigm and methodology are emerging to systematically address the design and verification bottlenecks that occur in the system-design process. It's becoming untenable for system, hardware, and software teams to work in a serial flow. Product architectures need to be thoroughly analyzed before handing off requirements to hardware and software design teams. Software debug and verification can't wait until the hardware is built. Time-to-market pressures just won't allow it.
The oft-mentioned but still nonexistent "intelligent testbench" would go a long way toward fulfilling the goal of rapid but thorough verification. Formal verification is making inroads, even at the high architectural level, in determining that circuits provide the function intended by their designers. Here again, moving up a level of abstraction to the algorithmic level could aid in making hardware-software functional partitioning decisions early in the design process.
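As a miniature of what formal equivalence checking establishes, consider verifying that a gate-level implementation matches the function its designer intended. The hypothetical C++ sketch below does this by exhaustive enumeration, which is complete only because the input space is tiny; real formal tools reach the same conclusion symbolically, without enumerating a single vector.

```cpp
#include <cassert>
#include <cstdint>

// Specification: 4-bit addition with carry-out, as the designer intends it.
uint8_t adder_spec(uint8_t a, uint8_t b) {
    return static_cast<uint8_t>((a + b) & 0x1F);  // 5-bit result
}

// "Implementation": a gate-level ripple-carry adder built from XOR/AND/OR.
uint8_t adder_impl(uint8_t a, uint8_t b) {
    uint8_t sum = 0, carry = 0;
    for (int i = 0; i < 4; ++i) {
        uint8_t ai = (a >> i) & 1, bi = (b >> i) & 1;
        uint8_t s = ai ^ bi ^ carry;                   // sum bit
        carry = (ai & bi) | (carry & (ai ^ bi));       // carry chain
        sum |= static_cast<uint8_t>(s << i);
    }
    return static_cast<uint8_t>(sum | (carry << 4));   // append carry-out
}

// Exhaustive equivalence check: complete for this 8-bit input space.
bool equivalent() {
    for (uint8_t a = 0; a < 16; ++a)
        for (uint8_t b = 0; b < 16; ++b)
            if (adder_spec(a, b) != adder_impl(a, b)) return false;
    return true;
}
```

The appeal of the intelligent testbench is precisely that it would find the handful of vectors that matter when exhaustive enumeration, as here, is no longer possible.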
Design methodologies involving programmable logic are seeing a sea change too. Whereas in the past FPGAs were considered a quick-and-dirty means of getting a specific function on a board, they're now becoming part and parcel of ASIC methodologies. FPGAs are increasingly viewed as a technology to be included in the context of larger ASICs.
FPGAs themselves are becoming quite large even as design resources grow scarcer than ever. With 50-million-gate devices on the horizon, tool vendors are gearing up capacities and providing tools that make large designs easier. Watch for this trend to continue because FPGA tools must scale with the silicon over the next several years.
As challenging an area as any for the EDA industry is RF design. For the RF IC designer looking to reduce costs, the goal is to get as many passive components onto the chip as possible. RF-oriented EDA vendors will continue producing tools for such design techniques and push toward linking the design tools to test and measurement technologies.
ICs, whether for RF applications or not, don't exist in isolation. Vendors of pc-board design tools are boosting their research-and-development budgets, even in the current business climate, to meet the signal-integrity challenges presented to board designers by devices such as high-speed SERDES (serializer-deserializer) ICs and 10-Gbit optical interconnects, and they're working to build such capabilities into their products.
So once you have chips that work and a board designed, you're facing a larger system-level design problem. Increases in the complexity and electronics content in automobiles and aircraft are forcing changes in the CAD tools used to design electrical distribution systems and wire harnesses. Trends developing in the electromechanical design of wire harness systems include wire synthesis, autorouting, and automatic wire diagram generation.
Geographically dispersed design teams create a broader problem for the EDA industry. Internet and browser technologies are creating a networking infrastructure that opens the possibility of bundling design-collaboration tools into the traditional EDA design environment for quick, easy desktop sharing and online access. In 2002, we'll see an upsurge in tools that provide the Web-based collaboration and communication infrastructures required for electronic design collaboration.
Yet another broad issue in EDA is tool interoperability. One effort to watch is the OpenAccess Initiative, which is spearheaded by the Silicon Integration Initiative Inc. (Si2). Widespread adoption of OpenAccess would usher in a level of tool interoperability never before seen in EDA. A number of key tool consumers are already behind the effort, while tool vendors are evaluating the application programming interface (API) and database specifications.
RF designers can look forward to tools that help eliminate off-chip passives and their performance-draining characteristics. Expect more tools for designing inductors, transformers, and filters into the ICs. Technologies as advanced as building acoustic resonators in silicon substrates to produce very high-Q filters are afoot, but again, toolsets for such design are lacking.
Expect to see more tools that address inductance extraction, which is increasingly an issue in achieving timing closure at clock rates of 500 MHz and higher. The embedding of RLC extraction within the place-and-route portion of the physical design flow will help, as will modeling of full-wave and mutual-inductance effects.
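For a sense of the magnitudes driving this trend, Grover's classic closed-form approximation for the partial self-inductance of a straight rectangular conductor can be sketched in a few lines of C++. This is strictly a back-of-the-envelope estimate of the kind a pre-layout screen might use, not what a production extractor does, and it ignores the full-wave and mutual-inductance effects that real tools must model.

```cpp
#include <cmath>

// Grover's approximation for the partial self-inductance of a straight
// rectangular conductor. Dimensions in centimeters, result in nanohenries.
// Valid when the length is much greater than the cross-section (w + t).
double wire_self_inductance_nH(double len_cm, double width_cm, double thick_cm) {
    double wt = width_cm + thick_cm;
    return 2.0 * len_cm *
           (std::log(2.0 * len_cm / wt) + 0.5 + 0.2235 * wt / len_cm);
}
```

For a hypothetical 1-mm-long wire, 1 µm wide and 0.5 µm thick (0.1 by 1e-4 by 0.5e-4 cm), the estimate comes out to roughly 1.5 nH, and it grows faster than linearly with length because of the logarithmic term. That's why on-chip inductance can no longer be waved off at gigahertz clock rates.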
With clock rates spiraling past 2 GHz, 0.1-µm processes kicking in, and current drain increasing, power consumption is displacing timing closure as Public Enemy #1 in SoC design. Look for logic design tools that yield better understanding of power optimization and control. At the architectural level, designers need tools that can point the way to inherent power management that shuts down idle circuitry.
System-level design languages, C/C++ and Superlog among them, will advance in standardization, aiding in their widespread adoption. The recent approval of the Verilog-2001 standard (IEEE 1364-2001) is a step in this direction. It remains to be seen whether designers will move toward the top-down approach of object-oriented languages like C/C++, or the tried-and-true, bottom-up approach of behavioral Verilog extensions.
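To see why the two camps differ, compare how a simple loadable counter might be described. The following is an illustrative plain-C++ behavioral model, not any particular standard or class library: the block is an object, and one clock edge is a method call.

```cpp
#include <cstdint>

// Illustrative top-down behavioral model of an 8-bit loadable counter.
// One call to cycle() represents one rising clock edge.
class Counter {
public:
    void cycle(bool load, uint8_t data) {
        if (load)
            value_ = data;                         // synchronous load
        else
            value_ = static_cast<uint8_t>(value_ + 1);  // wraps at 256,
                                                        // like 8-bit hardware
    }
    uint8_t value() const { return value_; }

private:
    uint8_t value_ = 0;  // reset state
};
```

A behavioral-Verilog description of the same counter would instead be written bottom-up, as a register updated in an always block; the object-oriented route starts closer to the architectural specification and refines downward toward an implementation.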
Silicon virtual prototyping will fill the void between synthesis and place-and-route, feeding much-needed physical information back into the synthesis process. Virtual prototyping, which some tools perform at the full-chip level, can at least give the designer a platform from which to make block-level implementation decisions. Expect this technique to gain in prominence in 2002.
At gigahertz clock rates, almost every element of a pc-board design is an effective RF radiator, leading to significant signal-integrity issues. Watch for pc-board tools that take more of a system approach to interconnect design. Users will see tools that integrate modeling and verification of the chip, package, and board environment for high-speed links.
FPGAs are climbing toward 50 million system gates by 2004. But the industry downturn has resulted in smaller design teams requiring tools that emphasize efficiency. Designers can look for FPGA design tools that offer advances in productivity and automation.
The 39th Design Automation Conference (June 10-14, 2002, New Orleans) will be, as always, the EDA industry's premier technical forum. It also is the showplace for the year's hottest new tools.
Functional verification remains a bottleneck as designers continue searching for the long-sought "intelligent testbench." Look for advances in automatic vector generation. Formal processes are emerging to tightly couple hardware/software design and verification flows with the original system specification.
2002 could see advances in tool interoperability between vendors, thanks to initiatives such as OpenAccess. Next year could be make-or-break for this effort. Version 2 of the OpenAccess API specification should appear by June's Design Automation Conference.