Determinism Means More Than Faster Processors

Feb. 16, 2006

A bounded response to events is the key to defining a hard real-time system. Real-time systems require determinism to ensure predictable behavior. Without determinism, systems can't be called real-time. And without bounded determinism, systems can't be classified as hard real-time. Full-featured operating systems (OSs) such as Windows aren't deterministic, but some designers may believe that faster processors can achieve a semblance of determinism. The level of determinism required is a function of the frequency of the real-time events and the effect of delays on system dynamics: how often events occur and how quickly the system must respond to them. The ability to place a finite and acceptable bound on these numbers distinguishes a "hard" real-time system from a "soft" one.

Faster processors, memory, and peripherals improve aggregate performance. But generally, they don't affect a system's bounded determinism. Given the way that OSs such as Windows assign priorities and manage tasks, a faster processor may not change the worst-case response time to an event. Increasing speed can decrease jitter, the variation in response time to an event. Yet it won't eliminate jitter, especially worst-case jitter. Worst-case jitter is usually caused by misplaced priorities assigned to software tasks (especially drivers), not by the hardware.

Improving the performance (or speed) of a system is useful. More performance allows more complex algorithms to be implemented in a given period of time (that is, within a sample interval). Thus, a faster system can improve the quality of the control and data-acquisition system you can implement in software. But bounded determinism is still required to ensure a stable and accurate system, regardless of that system's performance level.

So what do you do when you need the human-directed resources of an OS such as Windows and a measure of real-time determinism? Try a real-time operating system (RTOS) that works with Windows. But you don't want an RTOS that runs inside of Windows, which isn't deterministic. You can't ensure deterministic processing from within Windows. Instead, you need an RTOS that runs alongside Windows.

You want to create an environment of multiple virtual machines (VMs) running on a single physical CPU, where Windows runs unmodified on one VM and the RTOS runs on the other. This setup uses the CPU's hardware to protect each VM from affecting the other.

With separate VMs, you can contain each OS to ensure that runaway processes in one OS never affect those in the other. A VM approach lets real-time applications run in user-mode, not kernel-mode. The result is improved reliability and robustness because real-time processes run in separately managed memory segments.

These segments are distinct from those used by Windows. They also provide address isolation and protection between real-time processes and non-real-time Windows code. And they simplify development and debugging, because all processes run in protected user-mode rather than unprotected kernel-mode. There's no speed penalty, only increased reliability and safety!

The VM approach also makes it very easy to host existing applications that have both real-time and human-directed components on a system that uses a dual-core CPU. One of the processors can host the RTOS, and the other can host the human-directed OS. Dual-core CPUs typically provide a mechanism for signaling between software running on one processor and the other. Designers should look for OS environments that support such inter-environment communications.

Faster processors can't make full-featured or human-directed OSs more deterministic. But adding an RTOS that's designed for determinism and also for compatibility with the other OS can solve the problem.

See Associated Figure

