Performance Is A Plus, But Debugging Dogs Developers

March 29, 2007

Developers are looking to new solutions and approaches to test and debug increasingly complex software without compromising quality or schedules. The old methodologies, based around debugging and testing the software on physical hardware, are no longer adequate.

Adding urgency to the quest for new tools is the transition to multicore processors, which allow for better computing performance while keeping power consumption low. However much they add to computing performance, though, these processors also add a whole new level of development and debugging complexity.

Developers widely acknowledge that debugging parallel programs isn't easy. Provoking and reproducing problems in parallel programs is much more difficult than it is in single-threaded programs on single processors. In classic debugging, most bugs are deterministic and caused by particular variations of input. Such deterministic bugs will still occur in each task in a parallel system and will be solved in a traditional manner.

NEW BUGS
Parallel programs add a new category of bugs caused by the interaction of multiple tasks, often depending on the precise timing of their execution, memory accesses, and communications. These are the most difficult bugs to provoke and reproduce, not to mention understand and resolve.

In addition, some bugs don't become apparent for a long time, since the precise timing needed to trigger them occurs only rarely. It would be unusual for someone to deposit money into a bank account at precisely the same moment that someone else withdraws cash. But if the locking isn't handled correctly, such a problem can lurk for a long time before it appears, perhaps catastrophically.
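To make the scenario concrete, here is a minimal C sketch of the account example (an illustration of my own, assuming POSIX threads; it does not come from the article or any particular product). The shared balance is updated without a lock, so the result depends on how the two threads happen to interleave, and most runs will still print the correct total.

/* Minimal sketch of a timing-dependent bug: two threads update a shared
   account balance with unprotected read-modify-write sequences, so an
   update is occasionally lost. Build with: gcc -pthread account.c */
#include <pthread.h>
#include <stdio.h>

static long balance = 1000;            /* shared account balance */

static void *deposit(void *arg)
{
    for (int i = 0; i < 100000; i++)
        balance = balance + 1;         /* read, add, write: not atomic */
    return NULL;
}

static void *withdraw(void *arg)
{
    for (int i = 0; i < 100000; i++)
        balance = balance - 1;         /* races with deposit() */
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;
    pthread_create(&t1, NULL, deposit, NULL);
    pthread_create(&t2, NULL, withdraw, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    printf("final balance: %ld\n", balance);   /* should be 1000; sometimes isn't */
    return 0;
}

Guarding each update with a pthread_mutex_lock/pthread_mutex_unlock pair removes the race; the point is that nothing in a single successful test run reveals that the lock is missing.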

The most promising approach to multicore software development is to move from executing software on physical hardware to executing it on a virtual model of the hardware. Virtualized software development environments are functionally accurate, meaning that they run unchanged production binaries.

Developers familiar with lower-level simulators might assume that the simulation technology underlying virtualized software development is too slow. But advances such as just-in-time compilation and acceleration of idle time, coupled with fast and inexpensive workstations, make it possible to simulate complex systems at peak speeds measured in billions of simulated instructions per second, performance adequate for the edit-compile-debug loop that makes up a large part of a programmer's daily work.

VIRTUALIZED REALITY
Virtualized software development synchronizes all processors and other devices, making it possible to halt the whole system simulation when one part of the system is stopped, for example, at a breakpoint. This is not the case with physical hardware. With complete observability of any aspect of system state, and with much better control, the virtual system is simply a much better basis for developing multicore applications.

Also, virtualized software development can reverse-execute code. This is a powerful debugging feature since it becomes possible to simply wait for an error to occur and then run backward to determine the cause—again, a feat impossible in traditional hardware-based development methodology.

Since physical hardware can only run time forward, developers must go through the time-consuming process of rebooting the system and rerunning it to just before the point of failure. Even that doesn't guarantee the bug will reproduce, because the execution of a complex system isn't deterministic.

In contrast, reverse execution allows the system to be stepped back into a fault condition after it has run past that point. The whole system can be reversed, even out of "unrecoverable" errors such as operating-system crashes, kernel panics, segmentation faults, and accidental file deletions.
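To illustrate why this helps, consider the short C sketch below (a hypothetical example of my own; the article names no specific code or tool). The visible crash happens long after the instruction that caused it, so running forward only shows the symptom, while running backward from the crash leads straight to the faulty write.

/* Hypothetical example of a bug whose symptom is far from its cause:
   the kind of defect where reverse execution pays off. The unchecked
   strcpy() in fill_name() overruns name[] and silently corrupts the
   adjacent handler pointer; the crash appears much later, in dispatch(). */
#include <stdio.h>
#include <string.h>

struct request {
    char name[8];
    void (*handler)(void);       /* gets overwritten by the overflow */
};

static void ok_handler(void) { puts("handled"); }

static void fill_name(struct request *r, const char *src)
{
    strcpy(r->name, src);        /* no length check: the real cause */
}

static void dispatch(struct request *r)
{
    r->handler();                /* the visible symptom: crashes here */
}

int main(void)
{
    struct request r = { "", ok_handler };
    fill_name(&r, "much-too-long-name");
    /* ...any amount of unrelated work may run in between... */
    dispatch(&r);
    return 0;
}

Running forward, a debugger stops only at the crash inside dispatch(). With reverse execution, the developer can place a watchpoint on the corrupted handler field at the moment of the crash and run backward; execution halts at the strcpy() in fill_name(), pointing directly at the cause.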

Virtualized software development and test makes programmers more productive by making everything observable and controllable in a way that physical hardware does not allow. It also leads to higher quality, because more testing can be automated, hardware configurations can be changed at will, and situations that are impossible to create on hardware can be tested.

Furthermore, virtualized software development is one technique for addressing both the complexity of software and the additional problems that multicore architectures bring. In fact, many companies with state-of-the-art development processes are planning all their software development and test ahead of hardware availability, with first-customer-ship just days after delivery of the first hardware, thanks to virtualized software development.
