The design challenges implied by systems-on-a-chip (SoCs) are overwhelming. Yet the design work itself is becoming less and less of the overall battle. Verifying that you've designed what you intended has become the greater hurdle, and it will take inordinately long unless significant advances in verification methodology are made.
On the design end, SoC design methodologies must evolve to encompass much more intellectual-property (IP) reuse. One important element of this evolution will be separating the control/transport part of a design, in which IP blocks communicate with each other and move control signals around the chip, from the actual number crunching itself. The result will be a raising of abstraction, simplifying both design and verification. In addition, the IP blocks themselves must adapt to fit such methodologies by incorporating I/O that can be modified to fit various platform types and bus structures while the basic function remains immutable.
Likewise, verification strategies must continually raise the bar in terms of efficiency and coverage. Much hope has been placed in the notion of the "intelligent test bench": the automated development of functional verification vectors at a high level using dedicated verification languages, such as Verisity's e or the OpenVera language. There's concern, however, that such approaches can produce long verification runtimes without necessarily delivering the functional coverage that high-end designs require. The answer may be found in hybrid approaches that fold FPGA-based verification, hardware emulation and acceleration, and C/C++ test benches at cycle-accurate or bus-functional levels into the mix. For the foreseeable future, assertion-based verification will continue to gain ground as the best hope for steady improvements in verification efficiency.
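The constrained-random idea behind languages such as e and OpenVera can be sketched in plain C++. Everything below is illustrative: the BusWrite fields, the 4-KB address window, and the set of legal burst lengths are hypothetical stand-ins, not drawn from any real bus protocol or verification library.

```cpp
#include <cstdint>
#include <random>
#include <set>

// Hypothetical bus-write transaction; the fields and constraints are
// invented for illustration.
struct BusWrite {
    uint32_t addr;   // constrained to a word-aligned 4-KB window
    uint8_t  burst;  // constrained to legal burst lengths 1, 2, 4, 8
};

// Constrained-random generation: draw random fields, then shape them so
// the transaction satisfies its constraints -- the same idea e or
// OpenVera testbenches automate at much larger scale.
inline BusWrite randomWrite(std::mt19937& rng) {
    std::uniform_int_distribution<uint32_t> addrDist(0, 0xFFF);
    const uint8_t legalBursts[] = {1, 2, 4, 8};
    std::uniform_int_distribution<int> burstDist(0, 3);
    BusWrite t;
    t.addr  = addrDist(rng) & ~0x3u;        // force word alignment
    t.burst = legalBursts[burstDist(rng)];
    return t;
}

// Toy functional-coverage model: fraction of burst-length bins hit.
inline double burstCoverage(const std::set<uint8_t>& hits) {
    return hits.size() / 4.0;   // 4 legal burst-length bins
}
```

The coverage function mirrors, in miniature, the worry raised above: random vectors only close coverage if the bins actually get hit, which is why coverage feedback matters as much as generation.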
>HIGH-END SoC composition is fueling a drive toward customer-owned tooling (COT) business models. Amalgams of custom and synthesizable logic, large amounts of memory (about half of many designs' transistor budgets today, rising to around 70% by 2005), and sensitive analog blocks pose an enormous verification challenge. Not content with signing off at the gate level, COT adopters go all the way down to GDSII, performing layout, LVS/DRC checking, and final post-layout verification.
>SYSTEM VERIFICATION of PC-board designs in software is a key part of the design flow that engineers often bypass. Yet PC-board designers face increased signal-integrity and timing issues due to the higher clock frequencies and signal edge rates of high-speed digital boards. Expect EDA and IC vendors to work together to offer high-quality models that support the latest high-end buffer technologies, making simulation easier.
>GLOBAL ORGANIZATIONS seek tools and methodologies that support dispersed design teams and account for the increased reliance on outsourcing. Design teams are moving away from mere design databases toward a complete infrastructure approach. IP providers, however, want to protect their assets, so online solutions will offer high levels of security that give customers complete control over the data flow. In addition, collaboration technology will accommodate cross-platform communication between Windows and Unix systems.
>EXPECT MORE WIDESPREAD adoption of EDA tools designed for 130- and 90-nm process technologies. The move to 130-nm process technology didn't take off as fast as expected in 2002, as manufacturing and yield problems stemmed from the inability of existing EDA design tools to produce clean results. Some design houses stuck with 150 nm, and it paid off. Mainstream microprocessors will switch to 90-nm processes in the second half of 2003.
>ASSERTION-BASED VERIFICATION will gain significant ground in 2003. Look for the Open Verification Library and Sugar 2.0 to be the rallying points, owing to their growing popularity and compatibility with existing Verilog and VHDL-based simulation environments. More than 25% of designers will begin using assertion-based verification in their next design starts.
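To make the idea concrete, the cycle-by-cycle property checking that libraries like OVL express in Verilog or VHDL can be sketched in C++. The req/ack handshake rule and the maxWait bound below are hypothetical; a real OVL checker would be instantiated alongside the RTL and sampled by the simulator each clock.

```cpp
// Cycle-by-cycle handshake assertion, sketched in C++ rather than OVL
// or Sugar: "every req must be acknowledged within maxWait cycles."
// The monitor is fed one sample per clock edge, the way an in-simulation
// assertion watches its signals every cycle.
class ReqAckAssertion {
public:
    explicit ReqAckAssertion(int maxWait) : maxWait_(maxWait) {}

    // Call once per clock edge; returns false once the property fails.
    bool sample(bool req, bool ack) {
        if (waiting_) {
            if (ack)                    waiting_ = false;   // satisfied
            else if (++elapsed_ > maxWait_) failed_ = true; // timed out
        }
        if (req && !waiting_) {         // arm the check on a new request
            waiting_ = true;
            elapsed_ = 0;
        }
        return !failed_;
    }

    bool failed() const { return failed_; }

private:
    int  maxWait_;
    int  elapsed_ = 0;
    bool waiting_ = false;
    bool failed_  = false;
};
```

The appeal noted above follows directly: the check is declared once, rides along with every simulation run, and flags the failing cycle itself instead of leaving a corrupted result to be diagnosed thousands of cycles later.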
>TOOL INTEROPERABILITY is a long-held dream for EDA users. The Silicon Integration Initiative's OpenAccess Coalition is building momentum for standardizing a common database and application programming interface. Look for the movement to pick up lots of steam in 2003 as new members and important collaborators come onboard.
>THE PUSH FOR PHYSICAL INFORMATION at the register transfer level (RTL) continues with increased emphasis on RTL analysis. ASIC vendors will begin to insist that the designs they receive from customers be run through tools that check the RTL for compliance with their process requirements.
>PLATFORM-BASED DESIGN is generally thought of as centering on a processor core and associated peripherals, but new approaches to on-chip interconnect schemes could alter that perception. Expect increased use of methodologies that involve bundling intellectual property into large functional blocks coupled with separation of the I/O for those blocks from their functionality.
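The I/O-versus-function separation described above resembles a software adapter pattern, sketched here in C++. The ChecksumCore, BusAdapter, and the two bus flavors are invented for illustration; they stand in for a fixed IP function wrapped by swappable, platform-specific interface logic.

```cpp
#include <cstdint>
#include <vector>

// The core function of the IP block never changes from platform to
// platform; here it's a toy checksum standing in for the real function.
class ChecksumCore {
public:
    uint32_t process(const std::vector<uint8_t>& data) const {
        uint32_t sum = 0;
        for (uint8_t b : data) sum = (sum << 1) ^ b;   // toy hash
        return sum;
    }
};

// Platform-specific I/O is isolated behind an adapter interface.
class BusAdapter {
public:
    virtual ~BusAdapter() = default;
    virtual uint32_t write(const std::vector<uint8_t>& payload) = 0;
};

// Two hypothetical bus flavors wrapping the same immutable core.
class NarrowBusAdapter : public BusAdapter {
public:
    uint32_t write(const std::vector<uint8_t>& payload) override {
        return core_.process(payload);   // byte-serial bus: pass through
    }
private:
    ChecksumCore core_;
};

class WideBusAdapter : public BusAdapter {
public:
    uint32_t write(const std::vector<uint8_t>& payload) override {
        // 32-bit bus: pad to a word boundary before handing to the core.
        std::vector<uint8_t> padded = payload;
        while (padded.size() % 4 != 0) padded.push_back(0);
        return core_.process(padded);
    }
private:
    ChecksumCore core_;
};
```

Retargeting the block to a new platform means writing a new adapter, not touching (or re-verifying) the core function, which is the point of the methodology.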
>TOOLS WILL EMERGE that combine all types of simulation engines, from large-signal model simulation to digital, fast-Spice, and RF simulation, in a single environment. These tools will let designers freely combine VHDL, Verilog, VHDL-AMS, Verilog-AMS, Spice, and C anywhere in the design.
>BONDS WILL BE FORGED to create a transaction-level hardware/software co-development methodology. Hardware modeling, performed as simulation or emulation, will be closely tied to software modeling in the form of instruction-set simulation at a cycle-accurate level. SystemC will become the standard-bearer for architectural and system modeling, facilitating co-simulation between HDL implementations and SystemC-based platform designs. In the architectural modeling arena, look for metrics to evolve for evaluating such attributes as throughput, bandwidth, and memory utilization.
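A transaction-level throughput metric of the sort described above can be sketched in plain C++ (a production flow would use SystemC's module and channel machinery instead). The Transaction fields and the 2-cycle-setup, 4-bytes-per-cycle bus timing are invented for illustration.

```cpp
#include <cstdint>
#include <vector>

// A transaction abstracts away pin-level detail: just what moved, not
// how the wires wiggled.
struct Transaction {
    uint32_t addr;
    uint32_t bytes;
};

// Hardware-side timing model: hypothetical bus that moves 4 bytes per
// cycle after a fixed 2-cycle setup per transaction.
inline uint64_t busCycles(const Transaction& t) {
    return 2 + (t.bytes + 3) / 4;
}

// Software-side model: replay a transaction trace, accumulate cycles,
// and report bytes per cycle -- the kind of throughput metric that
// architectural modeling would standardize.
inline double throughput(const std::vector<Transaction>& trace) {
    uint64_t cycles = 0, bytes = 0;
    for (const Transaction& t : trace) {
        cycles += busCycles(t);
        bytes  += t.bytes;
    }
    return cycles ? static_cast<double>(bytes) / cycles : 0.0;
}
```

Because both sides trade in transactions rather than pin wiggles, the software model and the hardware timing model can evolve independently while still co-simulating against the same trace.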