Full-System Simulation: Escape From Reality

Sept. 20, 2004
A major transformation in complex system-level design is under way, as ever-increasing numbers of electronic systems are being implemented as software. The critical progression for the delivery of a typical working system has moved away from finalizing the hardware (or chip) design to finishing the software development and then completing the system integration.

As the task of developing software increases in complexity and length, the system team faces two major challenges: Software development must begin earlier and can no longer wait for hardware development to be completed, and the cost of the traditional hardware-based approach to testing software is becoming both prohibitively expensive and unwieldy. Using full-system simulation to virtualize the product development process solves these problems.

To implement a virtual approach, a software model of the system, known as a "virtual platform," is built and run on an underlying simulation environment. The virtual platform must have both fidelity and performance. Fidelity means the software "cannot tell the difference": unmodified binaries run on the virtual platform exactly as they would on the real hardware. Performance must be high enough that software developers enthusiastically prefer the virtual platform, looping quickly through the edit-compile-debug cycle (unlike slow co-verification environments).
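To make "fidelity" concrete, here is a deliberately tiny sketch of an instruction-set simulator (the opcodes and encoding are invented for illustration, not any real ISA or vendor API). The point is that the guest program is a raw binary image, interpreted exactly as written, with no source changes or recompilation:

```python
# Toy instruction-set simulator. "Fidelity" means the guest binary is
# executed as-is: the simulator consumes raw machine code, not source.

def run(memory, pc=0):
    """Interpret a tiny invented 3-byte-instruction ISA until HALT.

    Opcodes (made up for this example):
      0x01 r, imm  -> reg[r] = imm
      0x02 r, s    -> reg[r] += reg[s]
      0xFF         -> halt
    """
    regs = [0] * 4
    while True:
        op = memory[pc]
        if op == 0x01:                       # load immediate
            regs[memory[pc + 1]] = memory[pc + 2]
            pc += 3
        elif op == 0x02:                     # add registers
            regs[memory[pc + 1]] += regs[memory[pc + 2]]
            pc += 3
        elif op == 0xFF:                     # halt
            return regs
        else:
            raise ValueError(f"illegal opcode {op:#x} at {pc}")

# The "unmodified binary": a flat byte image, as it would sit in ROM.
program = bytes([
    0x01, 0, 7,    # r0 = 7
    0x01, 1, 5,    # r1 = 5
    0x02, 0, 1,    # r0 += r1
    0xFF,          # halt
])
print(run(bytearray(program)))   # -> [12, 5, 0, 0]
```

A production simulator models full processors, memory maps, and devices, but the contract is the same: the software under test cannot tell it is not running on silicon.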

To be effective, the virtual platform must simulate the system being designed and enough of the surrounding environment to model real-world use--in effect, a virtual test rack. For example, if a set-top box is being designed, it is not enough to model just the set-top box. You may have to model several set-top boxes, a central server delivering video-on-demand, and perhaps a PC to administer the system and a server to handle billing.

Past attempts at virtualization failed because performance fell short. Today, technical improvements in simulation techniques and the availability of inexpensive high-performance PCs make virtualization the approach of choice over hardware-based methods. For example, developers can simulate a 24-processor, 8-Gbyte, 64-bit enterprise server on a plain vanilla laptop. Virtualization is also increasingly scalable. It's now possible to simulate thousands of systems on a network of inexpensive workstations.

Virtualization has two major advantages. First, a virtual approach to system development is more cost-effective than real hardware, immediately reducing the amount of capital necessary to dedicate to software development and testing. This is especially true for organizations with vast product development teams, because the usual economics of software come into play. While the first instance may be expensive, duplicating software is inexpensive. Every engineer who needs access to the virtual platform can have it, unlike real hardware, which is always scarce.

Second, the virtual approach is simply more attractive than real hardware. In the real world, it is difficult to stop hard drives from spinning, or set a breakpoint on a control register access in the kernel, or confidently trace exactly which code affects a specific sensitive data structure. Good simulators can do all those things, and more--like stopping at an arbitrary point in the workload, saving the entire state to a set of files, and then firing off multiple clones with different fault scenarios injected.
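The checkpoint-and-clone idea above can be sketched in a few lines (the state layout and function names here are invented for illustration): because the entire machine state is ordinary data, "save the state" is just serialization, and "fire off clones with different faults" is restoring the snapshot and perturbing it.

```python
# Hedged sketch of checkpoint-and-clone fault injection -- not a real
# simulator's API. The whole machine state is plain data, so a
# checkpoint is a serialized blob and a clone is a restored copy.

import pickle

state = {
    "cycle": 1_000_000,
    "regs": [0x10, 0x20, 0x30],
    "ram": bytearray(b"\x00" * 64),
}

# Stop at an arbitrary point in the workload and save the entire state.
snapshot = pickle.dumps(state)

def clone_with_bit_flip(saved, addr, bit):
    """Restore a checkpoint and inject a single-bit RAM fault."""
    s = pickle.loads(saved)
    s["ram"][addr] ^= 1 << bit
    return s

# Fire off multiple clones, each with a different fault scenario.
clones = [clone_with_bit_flip(snapshot, addr=5, bit=b) for b in range(3)]
for c in clones:
    print(c["ram"][5])   # 1, 2, 4 -- same checkpoint, different faults
```

Real hardware offers no equivalent: you cannot freeze a running board, copy it three times, and resume each copy with a different memory fault.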

Many problems in today's systems occur during interactions between different products or multiple instances of the same product. These areas are most difficult to test with real hardware, which is notorious for "heisenbugs" (problems that disappear when you try to take a closer look at them). By being fully deterministic, the virtual approach greatly simplifies the tracking of these bugs. You can go back in time until just before the bug occurs and investigate the internals at any level of detail necessary.
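Determinism comes from logging every external input with the time it arrived; replaying the log reproduces the run bit-for-bit, so the simulation can be stopped just before the failure. A minimal sketch of that record/replay idea (the event format and names are invented for illustration):

```python
# Sketch of deterministic record/replay, with invented names. If every
# nondeterministic input is logged with its arrival cycle, re-running
# the simulation reproduces the buggy run exactly -- and a stop point
# lets us halt just before the bug to inspect internals.

def simulate(inputs, stop_at=None):
    """Run a trivial 'system' whose state accumulates logged inputs.

    inputs:  list of (cycle, value) events -- the only nondeterminism.
    stop_at: halt just before this cycle to inspect the state.
    """
    state = 0
    events = dict(inputs)
    for cycle in range(10):
        if stop_at is not None and cycle >= stop_at:
            break
        state += events.get(cycle, 0)
    return state

log = [(2, 5), (6, -7)]                 # events recorded during the buggy run
final = simulate(log)                   # replay: identical result every time
before_bug = simulate(log, stop_at=6)   # stop just before cycle 6
print(final, before_bug)                # -> -2 5
```

Because replay is exact, "going back in time" is simply replaying the same log with an earlier stop point, as many times as the investigation requires.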

As inexpensive workstations become ever faster, and as simulation techniques continue to improve, the compelling advantages of a virtual approach to advanced system development and testing will only increase.
