The Holy Grail of Unified Coverage: What’s the Reality?

May 19, 2010
Though Accellera's Unified Coverage Interoperability Standard has made strides, the dream of unified coverage metrics spanning both formal verification and simulation remains unfulfilled. Jasper's Rajeev Ranjan cuts through the hype in this Viewpoint.

Figure: Meaningful, combined metrics across simulation and formal verification

Verification engineers and their managers are intrigued by the idea of a unified coverage standard that can combine data from heterogeneous sources (different verification methods, tools from different vendors, and so on) not only into a single database, but into a single integrated coverage measurement and metrics system. It is a wonderful dream, and one best approached step by step.

In 2006, when I co-founded the Coverage Interoperability Forum (CIF), the precursor to Accellera's Unified Coverage Interoperability Standard (UCIS) technical subcommittee, we focused first on a manageable deliverable: a single database repository. This by itself was a lofty goal, and I felt it was vital to bring everyone together to pursue it.

Today, I am happy to report that we have made significant progress within the UCIS subcommittee, but I believe it is important to relate exactly what the unified database enables, and does not enable, with today’s simulation and formal verification technology and methodology.

UCIS is still working on a draft standard, based on a donation from Mentor Graphics, to enable reading and writing coverage data from simulation as well as formal verification. The draft is expected to be published later this year. Having this data stored and accessible through a common API will help verification teams, but it will not create a complete "unified" coverage standard. The notion of achieving unified coverage from different verification methods by simply combining some coverage metrics in a database is too simplistic. There is no known practical, technically sound way today to produce meaningful, combined metrics across simulation and formal verification (see the figure).
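To make concrete what a common database does and does not give you, consider this minimal Python sketch of simulation and formal records sitting side by side in one store. Every name and field in it is a hypothetical illustration for this article, not the draft UCIS API:

    # Hypothetical sketch only: the record layout and names below are
    # invented for illustration and are not the draft UCIS interface.
    from dataclasses import dataclass

    @dataclass
    class CoverageRecord:
        name: str     # coverage point or assertion, e.g. "fifo.cov_full"
        source: str   # "simulation" or "formal"
        status: str   # simulation: "hit"/"missed"; formal: "proven"/"unreachable"

    # One database, two buckets: the records coexist, but a simulation
    # "hit" and a formal "proven" measure different things, so simply
    # counting them together would be meaningless.
    database = [
        CoverageRecord("fifo.cov_full",           "simulation", "hit"),
        CoverageRecord("fifo.cov_underflow",      "simulation", "missed"),
        CoverageRecord("fifo.assert_no_overflow", "formal",     "proven"),
        CoverageRecord("fifo.cov_underflow",      "formal",     "unreachable"),
    ]

    for rec in database:
        print(f"{rec.source:>10}: {rec.name} -> {rec.status}")

The common API gets both kinds of records into one place with one access mechanism; what it does not do, as discussed below, is tell you how to add them up.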

I have observed a misconception within some parts of the verification community that having both formal and simulation tools from a single vendor provides "automatic" unified coverage, and hence that coverage closure is easier when one vendor's platform supports both dynamic simulation and formal verification under the same umbrella. In fact, a standard API creates a level playing field for point-tool vendors: a vendor whose verification platform includes both formal and simulation tools gains no additional advantage. Moreover, a standard API that stores coverage information in the same database (within different buckets) does not mean the coverage data can be meaningfully merged.

In reality, an effective means of measuring overall verification coverage with both formal and simulation metrics remains elusive, regardless of which tools are used. The traditional coverage metrics used in simulation-based approaches simply do not apply to formal. Likewise, the formal notion of absolute correctness, a property proven for all legal inputs, has no counterpart in the simulation world. Even though the metrics cannot be combined directly, however, simulation and formal can be leveraged together in a number of useful ways.

Different verification methods can be leveraged in a top-down way by appropriately including them in the verification plan, and in a bottom-up way by using the methods to provide results for one another. When creating the APIs for coverage data, it is important to bear in mind the differences between these two classes of tools and ensure that the mechanisms exist to convey all of the available information across the interface.

In the top-down view, simulation and formal can be leveraged in a systematic way as dictated by a verification plan. Effective verification planning adds predictability in the verification flow by specifying exactly what needs to be tested. Planning also provides the necessary structure to identify key areas where complementary verification technologies can be applied effectively. By making sure that the verification tasks are tied to the specification, and assigning the appropriate verification method (simulation or formal analysis) to each such task, one can significantly reduce, if not eliminate, any redundant verification effort.
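As a rough sketch of what such plan-driven assignment looks like in practice (the plan format, requirement names, and helper function below are hypothetical illustrations, not any particular tool's format):

    # Sketch of top-down planning: each spec requirement is assigned
    # exactly one verification method, so redundant effort (two methods
    # attacking the same task) is designed out of the plan.
    plan = {
        "spec 3.1 arbiter fairness":     "formal",      # exhaustive proof suits it
        "spec 3.2 bus throughput":       "simulation",  # needs long realistic traffic
        "spec 3.3 fifo never overflows": "formal",
        "spec 3.4 end-to-end data path": "simulation",
    }

    def report_effort(plan):
        """Summarize which method owns which requirement."""
        for method in ("formal", "simulation"):
            tasks = [req for req, m in plan.items() if m == method]
            print(f"{method}: {len(tasks)} task(s)")
            for t in tasks:
                print(f"  - {t}")

    report_effort(plan)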

In the bottom-up view, formal analysis and simulation can cooperate to improve overall coverage. For instance, formal analysis can establish that certain coverage targets are unreachable for any legal input sequence, which eliminates wasted simulation effort on those areas (see the sketch below). Formal analysis can also generate specific scenarios automatically that, when simulated, fulfill specific coverage targets. It can likewise reduce simulation effort when a design block is fully verified for a specific mode (perhaps the most complex one), freeing simulation to target other areas. In the other direction, simulation can provide sequentially deep "seed states" (states that serve as starting points for formal analysis) that would be practically difficult, if not infeasible, to reach with formal analysis alone.
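To illustrate the first of these interactions, here is a minimal sketch, with invented bin names and counts, of how a formal proof of unreachability soundly adjusts a simulation coverage score:

    # Sketch of one sound interaction: formal proofs of unreachability
    # shrink the simulation coverage denominator. Bin names and numbers
    # are hypothetical.
    sim_hit     = {"cov_full", "cov_empty", "cov_wrap"}        # bins hit in simulation
    all_bins    = {"cov_full", "cov_empty", "cov_wrap",
                   "cov_underflow", "cov_simultaneous_rw"}
    unreachable = {"cov_underflow"}                            # proven dead by formal

    # The naive score penalizes simulation for a bin no legal input can reach.
    naive    = len(sim_hit) / len(all_bins)                    # 3/5 = 60%
    # Excluding formally proven-unreachable bins gives the honest target.
    adjusted = len(sim_hit) / len(all_bins - unreachable)      # 3/4 = 75%

    print(f"naive: {naive:.0%}, adjusted for unreachable bins: {adjusted:.0%}")

Note that this does not merge the two metrics into one number; it uses a formal result to correct the denominator of the simulation metric, which is exactly the kind of targeted, well-defined interaction that is possible today.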

Being involved with these standards efforts from the beginning has shown me just how challenging it is to merge metrics from various simulators and formal verification tools into a single, usable database. Bottom line: we are far from the "holy grail" of a unified coverage database; the meaning of the metrics still needs to be defined, and the setups and data sets are simply too different. At the same time, there is real synergy between the tools, as outlined above, but only because they are deployed to maximize their effectiveness as part of an overall verification plan, not because we have discovered a hidden path to coverage enlightenment.
