
What's The Difference: Ada Then and Now

Feb. 27, 2013
Ada has evolved over the years, taking in the latest programming ideas, from object-oriented programming to contract-based programming.


The International Organization for Standardization recently approved the latest revision of Ada, known as Ada 2012, close to 30 years after the very first version. The language’s evolution shows how Ada has accounted for major software technology developments of the past several decades while retaining the original aims and flavor of its initial design.


Ada 83: A Portable, Modern High-Order Language

Ada 83 was the result of an international competition organized by the U.S. Department of Defense (DoD), and it was designed to replace the more than 500 programming languages then in use within the DoD. It was a general-purpose programming language intended for large, long-lived software systems, and that goal gave rise to two basic criteria that have shaped every feature of the language, from Ada 83 through every subsequent version.

First, the language must help programmers detect errors in their code as early as possible. This requires a strong typing mechanism and imposes on the programmer the responsibility of providing precise declarations for types and subtypes—a novel notion at the time. In return, the compiler can recognize more mistakes in the code, and the runtime can include checks that declarations are obeyed by dynamic values. Adverse reactions to this point of view were usually stated as “I don’t need no stinkin’ checks.”

Second, the language favors the reader over the writer. For long-lived systems, it is more important to have well-structured programs that can be understood by others than terse programs that save keystrokes and obscure their purpose. As a result, the language has a rich syntax (“verbose” to its detractors) that allows programs to “read better” on the page.

The original design also incorporated the ideas of the nascent discipline of software engineering: modularization, information hiding, and separation of concerns. The result was a novel language with an ambitious set of features:

  • Programs organized around packages: A package has a specification and a body. The specification contains all that a client needs to use the services offered by the package, while the body contains the implementation of these services (a sketch follows this list).
  • Scalar types, array types, record types (like C structs), access types (like C pointers but with type safety), and private types: Array types are self-describing. (They carry their bounds.) Record types can be parameterized by means of discriminants. Scalar types and access types are named, and type checking is by name equivalence (not structural equivalence). Type checking is performed both within a compilation unit and across units.
  • Private types that provide data encapsulation: They specify a set of applicable operations, but the implementation of the type and its operations is opaque to the client.
  • A generic mechanism providing parameterized packages and subprograms: Generic units can be specialized by explicit instantiation.
  • Concurrency supported within the language: Tasks are independent units of execution that communicate by synchronizing and exchanging data through rendezvous.
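
To make the package and private-type bullets concrete, here is a minimal sketch in the spirit of Ada 83 (the Counters package and its operations are invented for illustration): the specification is everything a client ever sees, while the representation and the implementation stay hidden.

package Counters is
   type Counter is private;                       -- encapsulated type
   procedure Increment (C : in out Counter);
   function  Value     (C : Counter) return Natural;
private
   type Counter is record                         -- representation hidden from clients
      Count : Natural := 0;
   end record;
end Counters;

package body Counters is                          -- separately compiled implementation
   procedure Increment (C : in out Counter) is
   begin
      C.Count := C.Count + 1;
   end Increment;

   function Value (C : Counter) return Natural is
   begin
      return C.Count;
   end Value;
end Counters;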

The result was a large language by the standard of the times, and it is fair to say that it stretched the abilities of compiler writers and the capacities of most 1980s-vintage hardware. This delayed the spread of the language by a few years. Several robust compilers were available by 1987, and Ada started to see significant adoption, mostly within aerospace and military applications. DoD programming language policy at the time encouraged Ada usage, and there were also some significant Ada projects outside the military-aerospace domain (e.g., steel mill control, transport systems).

Ada 95: Here Come The Objects, And More

In the late 1980s, object-oriented programming became the new paradigm in software construction, and C++ became increasingly popular. Ada 83 had a limited view of inheritance (called type derivation) but no notion of type extension. A review of the language that started in 1990 led to a major revision, culminating in a new standard: Ada 95.

The most important enhancements offered by Ada 95 were:

  • Object-oriented programming, supported by the notion of tagged types and type extensions, primitive operations, polymorphism, and dynamic dispatching (a sketch follows this list).
  • The organization of packages into hierarchies (child units) to provide a better notion of software subsystem.
  • A new construct for data synchronization, protected types, that generalized the older notion of monitor.
  • A fully defined interface to other languages, in particular FORTRAN, C, and COBOL.
  • A comprehensive predefined library, including packages for character and string handling, mathematical processing, and command-line processing.
  • Annexes that address specialized application needs for systems programming, real-time systems, distributed systems, information systems, numerics, and safety and security.
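
The sketch below (the Shapes package and its types are invented for illustration) shows the core of the Ada 95 object model: a tagged type, a type extension, and a call that dispatches at run time through a class-wide parameter.

package Shapes is
   type Shape is abstract tagged null record;
   function Area (S : Shape) return Float is abstract;   -- primitive operation

   type Circle is new Shape with record                  -- type extension
      Radius : Float := 1.0;
   end record;
   function Area (C : Circle) return Float;              -- overrides the primitive
end Shapes;

package body Shapes is
   function Area (C : Circle) return Float is
   begin
      return 3.14159 * C.Radius ** 2;
   end Area;
end Shapes;

with Ada.Text_IO, Shapes;
procedure Show_Area is
   procedure Report (S : Shapes.Shape'Class) is          -- class-wide parameter
   begin
      Ada.Text_IO.Put_Line (Float'Image (Shapes.Area (S)));  -- dispatches on the tag of S
   end Report;
   C : Shapes.Circle := (Radius => 2.0);
begin
   Report (C);
end Show_Area;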

It is worth emphasizing that Ada 95 offers two complementary mechanisms for software evolution: type extensions are akin to the classes of other O-O languages, while child units provide a separate mechanism for adding functionality to an existing system. The basic software component remains the package; types or classes are too small to serve this purpose, and a package (parent or child unit) typically declares several related types.
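
Continuing the invented Shapes example above, a child unit adds a new type and operation to the subsystem without touching the parent package or its existing clients:

package Shapes.Rectangles is
   type Rectangle is new Shape with record
      Width, Height : Float := 1.0;
   end record;
   function Area (R : Rectangle) return Float;   -- overrides the inherited primitive
end Shapes.Rectangles;

package body Shapes.Rectangles is
   function Area (R : Rectangle) return Float is
   begin
      return R.Width * R.Height;
   end Area;
end Shapes.Rectangles;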

The evolution of Ada reflects the evolution of the software ecosystem. We are building larger and more complex systems. These systems are often aggregates of components written in several languages. So in addition to mechanisms to link components within Ada, we also need a way of interfacing to foreign components. Ada 95 introduces interfacing packages that declare types whose representation must match common types in these other languages, as well as the parameter passing conventions that must be used to invoke foreign subprograms.
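
As a brief illustration (the C function named here is hypothetical), an Ada 95 binding uses the types of the predefined package Interfaces.C together with an import pragma that specifies the foreign calling convention and link name:

with Interfaces.C;
package C_Math is
   --  Corresponds to a hypothetical C function:  int gcd (int a, int b);
   function GCD (A, B : Interfaces.C.int) return Interfaces.C.int;
   pragma Import (C, GCD, "gcd");   -- C calling convention, external name "gcd"
end C_Math;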

To declare these interfacing types, the language needs to describe the required data layout precisely, down to the bit level. Ada has had such representation clauses from the beginning, and it is worth noting that as a result it is truly a wide-spectrum language, usable at multiple levels of abstraction.
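
For example (the register layout below is made up), a record representation clause pins each component to specific bits so that an Ada type can mirror a hardware register or a foreign data structure exactly:

package Device_Registers is
   type Status_Register is record
      Ready : Boolean;
      Error : Boolean;
      Count : Natural range 0 .. 63;
   end record;

   for Status_Register use record      -- exact bit-level layout
      Ready at 0 range 0 .. 0;
      Error at 0 range 1 .. 1;
      Count at 0 range 2 .. 7;
   end record;
   for Status_Register'Size use 8;     -- the whole record occupies one byte
end Device_Registers;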

Ada 2005: The Joy Of Interfaces

The next revision of the language saw the light of day a decade later. Ada 2005 brought a smaller set of enhancements than its predecessor but still introduced some important functionality.

First, interface types (borrowed from Java) provide a form of multiple inheritance. A type can now have one parent but multiple (interface) progenitors. Among their novel applications, interfaces unify tasks and protected types, both of which can be defined as implementations of a given synchronized interface.
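
A minimal sketch (the Buffers package is invented) of a synchronized interface whose single operation is implemented by the entry of a protected type:

package Buffers is
   type Queue is synchronized interface;
   procedure Put (Q : in out Queue; Item : Integer) is abstract;

   protected type Mailbox is new Queue with       -- implements the interface
      overriding entry Put (Item : Integer);
   private
      Value : Integer := 0;
      Full  : Boolean := False;
   end Mailbox;
end Buffers;

package body Buffers is
   protected body Mailbox is
      entry Put (Item : Integer) when not Full is -- the barrier provides the synchronization
      begin
         Value := Item;
         Full  := True;
      end Put;
   end Mailbox;
end Buffers;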

Second, a new visibility mechanism (the “limited with” clause) allows the declaration of mutually dependent package declarations. Compilation dependencies previously had to constitute a directed graph without loops.
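
A sketch of two mutually dependent specifications (the unit names are illustrative): each "limited with" provides an incomplete view of the other package, which is sufficient to declare access values designating its types.

limited with Departments;
package Employees is
   type Employee is record
      Dept : access Departments.Department;   -- the incomplete view is enough here
   end record;
end Employees;

limited with Employees;
package Departments is
   type Department is record
      Manager : access Employees.Employee;
   end record;
end Departments;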

Third, the original design of the language excluded the creation of subsets, and the official compiler validation test suite was intended to enforce the “no-subset” rule. However, for hard real-time purposes, the concurrency model of Ada is too rich and has too much implementation freedom. This makes a typical multitasking system hard to analyze in terms of deadlock, priority inversion, and other ills that concurrent systems are prone to. Ada 2005 includes the definition of a subset of the language’s concurrency features, the Ravenscar profile, which requires a smaller runtime than the full language and supports the construction of completely deterministic systems.
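
Adopting the profile is a single configuration pragma, normally supplied partition-wide, and the compiler can then reject constructs outside the subset. The sketch below shows a Ravenscar-compliant periodic task (the names are invented):

pragma Profile (Ravenscar);    -- only the deterministic tasking subset is allowed

with Ada.Real_Time; use Ada.Real_Time;

package Sampling is
   task Sampler;               -- library-level task, no entries, created at elaboration
end Sampling;

package body Sampling is
   task body Sampler is
      Period : constant Time_Span := Milliseconds (10);
      Next   : Time := Clock;
   begin
      loop
         --  periodic work would go here
         Next := Next + Period;
         delay until Next;     -- only absolute delays are permitted by the profile
      end loop;
   end Sampler;
end Sampling;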

And fourth, the predefined library was enhanced with additional numerics support (vectors, matrices, etc.) and an extensive containers facility.
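
For instance, the containers library offers generic packages such as Ada.Containers.Vectors, instantiated for a particular index and element type (the demo procedure below is invented):

with Ada.Containers.Vectors;
with Ada.Text_IO;

procedure Vector_Demo is
   package Integer_Vectors is new Ada.Containers.Vectors
     (Index_Type => Positive, Element_Type => Integer);

   V : Integer_Vectors.Vector;
begin
   for I in 1 .. 5 loop
      V.Append (I * I);                            -- 1, 4, 9, 16, 25
   end loop;

   for I in V.First_Index .. V.Last_Index loop
      Ada.Text_IO.Put_Line (Integer'Image (V.Element (I)));
   end loop;
end Vector_Demo;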

Ada 2012: Contract-Based Programming For Today’s Software Challenges

Like its predecessors, the latest revision of the language, which became an ISO standard in December 2012, addresses two sets of concerns: expressiveness and safety. For the first, there are several new expression forms, convenient iterators over containers, mechanisms for mapping tasking programs onto multicore architectures, and other improvements. However, the enhancements related to software safety are likely to be more significant in the long run. The language now includes a mechanism, known as aspects, for attaching assertions such as preconditions and postconditions to declarations (see the code below).

package Utilities is
   procedure Swap( Left, Right : in out Integer )
   with
      Post => Left=Right'Old and Right=Left'Old;

   function Factorial( N : Integer ) return Integer
   with
      Pre  => N in 0..12, -- 13! overflows 32 bit integers
      Post => 
        Factorial'Result = (if N=0 then 1 else N * Factorial(N-1));
end Utilities;

package body Utilities is
   ... -- Bodies of Swap and Factorial go here
end Utilities;

with Utilities;
procedure Testing is
   I, J, M, N : Integer;
begin
   ... -- Initialize I, J
   Utilities.Swap(I, J);
   ... -- Initialize N
   M := Utilities.Factorial(N);
   ...
end Testing;
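
For completeness, one possible body for Utilities (elided above); any implementation that satisfies the stated postconditions would do:

package body Utilities is
   procedure Swap( Left, Right : in out Integer ) is
      Temp : constant Integer := Left;
   begin
      Left  := Right;
      Right := Temp;
   end Swap;

   function Factorial( N : Integer ) return Integer is
      Result : Integer := 1;
   begin
      for K in 2 .. N loop
         Result := Result * K;
      end loop;
      return Result;
   end Factorial;
end Utilities;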

Preconditions and postconditions can be provided for subprograms. A precondition establishes that the program state (the subprogram’s input parameters and global data) must obey a stated condition for the subprogram to operate correctly. A postcondition establishes that the result of the subprogram, which includes the state of output parameters, obeys some other stated condition.

Type invariants stipulate that the internal state of a private type obeys some condition. A type invariant must hold whenever an object of the type is created or modified by a client-visible subprogram—that is to say by code that is external to the package that defines the type.
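
A sketch of a type invariant (the bank-account package is invented): the invariant is stated once on the private type and is checked whenever a client-visible operation creates or updates an Account.

package Accounts is
   type Account is private
     with Type_Invariant => Balance (Account) >= 0;   -- never overdrawn

   function  Balance  (A : Account) return Integer;
   procedure Deposit  (A : in out Account; Amount : Positive);
   procedure Withdraw (A : in out Account; Amount : Positive)
     with Pre => Amount <= Balance (A);
private
   type Account is record
      Balance : Integer := 0;
   end record;
end Accounts;

package body Accounts is
   function Balance (A : Account) return Integer is (A.Balance);

   procedure Deposit (A : in out Account; Amount : Positive) is
   begin
      A.Balance := A.Balance + Amount;
   end Deposit;

   procedure Withdraw (A : in out Account; Amount : Positive) is
   begin
      A.Balance := A.Balance - Amount;
   end Withdraw;
end Accounts;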

A subtype predicate defines a subset of an existing subtype. Only values within that subset are valid values of the subtype. Iterations over the subtype omit values that do not satisfy the predicate.
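
For example (the Day type is invented), a static predicate restricts an enumeration subtype to a subset of its values, and iteration over that subtype visits only the values that satisfy the predicate:

with Ada.Text_IO;
procedure Predicate_Demo is
   type Day is (Mon, Tue, Wed, Thu, Fri, Sat, Sun);

   subtype Weekend is Day
     with Static_Predicate => Weekend in Sat | Sun;

   subtype Even is Integer
     with Dynamic_Predicate => Even mod 2 = 0;   -- checked on assignment, conversion, parameter passing
begin
   pragma Assert (42 in Even);                   -- membership tests include the predicate check
   for D in Weekend loop                         -- visits only Sat and Sun
      Ada.Text_IO.Put_Line (Day'Image (D));
   end loop;
end Predicate_Demo;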

These constructs, collectively known as “contract-based programming,” aren’t completely novel. Some of them were included in Eiffel decades ago. But it is the first time that they appear as first-class citizens in a mainstream language, and they correspond to an interesting point in the evolution of programming. On the one hand, there has been remarkable progress in program analysis tools in the last decade. On the other hand, there is intense concern in the world at large about the safety and security of all software systems.

The field of high-reliability software used to be a niche occupied mostly by the aerospace industry, which developed stringent procedures for the development and certification of software systems, such as the DO-178B standard for commercial avionics. The last few years have made it clear that complex software touches our lives constantly. The safety of financial software, automotive software, and medical software is as vital as that of air traffic control software, for example.

From the beginning, Ada has included constructs designed to make programs more trustworthy. For instance, specifying the size of an array and ensuring that an array object is self-describing makes the purpose of the object clearer, but it also allows the compiler or the runtime to catch misuses of the object. A type or a subtype declaration is in fact an assertion about the behavior of a piece of software, and the compiler uses the assertion to verify that the code makes sense. A compiler is among other things a program analysis tool that can verify some simple properties of a program.

Other program analysis tools perform a deeper analysis of a program and can determine, for example, that an uninitialized variable in one unit can cause another unit to malfunction. Typically this analysis is more complex than what is done in conventional compilers, and it includes global data flow techniques.

At a higher level of complexity, program verification tools include theorem provers and can ascertain that the execution of a program obeys some general postconditions. Interestingly, such formal tools were discussed a half-century ago, but never in the context of an existing programming language. They were always applied to very small languages with well-defined semantics, and formal proofs of correctness did not become widespread. The methods were too cumbersome, the semantics of the language not formal enough, or the application domain did not seem to require their use.

Ada 2012 is intended to change this trend. By exploiting the richer assertion mechanism added to the language, programmers can indicate their intent more precisely and explicitly—always a good thing! These annotations also provide additional information to the compiler, the program analysis tools, and the formal verification tools, which can then ascertain more precisely whether what the programmer wrote is consistent. The result can only be more reliable software.

In a way, from the programmer’s point of view, the difference between a compiler (leaving aside the fact that it generates code!), a program analysis tool, and a verification tool is maturity. Compilers have been around for much longer. But the modern programmer concerned with program correctness (as all should be!) will also use more complex tools. Today, Ada is uniquely positioned to fit in this continuum of techniques.

About the Author

Dr. Edmond Schonberg | Vice President

Ed Schonberg is co-founder and Vice President of AdaCore. An emeritus professor of Computer Science at New York University, he has played a key role in the Ada community for over 35 years. One of the principal developers of the first validated Ada compiler at NYU, Ed has contributed significantly to several major advances in the Ada language and compiler technology. He has been notably a driving force behind the development of the GNAT front end. His research interests include the design and implementation of programming languages, Software Engineering and programming methodologies, and chamber music.
