I was recently at Design East in Boston. The show is changing, with more of a slant toward software and hardware tools. I met with a number of companies like AdaCore and LDRA. One topic we always tend to talk about is the challenge of convincing C/C++ programmers that static and dynamic analysis tools are worth their price, or that Ada is more than a military programming language.
One idea that came up a number of times, in slightly different terms, is that many programmers think other people make mistakes; if they themselves slip up, they assume they can find and fix the errors, or that the errors will be unimportant. The bottom line is "I don't make mistakes."
Of course, no one would admit to this. Anyone who has programmed knows that mistakes occur and that bugs can cause major problems. The problem with languages like C and C++ is that they are powerful, but it is all too easy to shoot oneself in the foot.
I can speak with some authority on this because I spent a good deal of time coding in C many years ago. I have used C++ but not as extensively. Back then the only static analysis tool was Lint. Things have changed significantly since then.
Standards like MISRA are commonly supported (see MISRA C: Safer Is Better). Static analysis tools can also be used to address security (see Can Static Analysis Address Security Issues?) as well as concurrency issues (see How Static Analysis Identifies Concurrency Defects).
Another thing that has changed is the amount of computing power available to the tools. Static analysis needs storage and compute cycles as the sophistication and depth of analysis increases. The latest PCs have that in spades. The problem is that the tools do not do anything if they are not used.
Ok, I am trying to convince you to use them. Most of the good ones will cost a good bit of cash, but the payback is significant. The cost of finding and fixing bugs has not really changed a lot. Figure a factor of ten in cost for each step software moves away from the programmer.
The big difference is that it is now possible to significantly reduce the number of bugs in the field. It has more to do with using the right tools than with not making mistakes, because you will make them.
So what about Ada?
I know that few C/C++ programmers will switch but it is not as if I have not made the recommendation before (see C Programmers, Time To Try Ada).
I can also say that I have done less programming in Ada than in Python or PHP, both of which I have used quite a bit in the last few years. Still, programming languages are an area I study, and Ada is what I would pick over C or C++ for embedded projects.
Why? Because it is easier to write bug-free programs in Ada than in C++, and definitely easier than in C.
Of course, the availability of tools is sometimes an issue. If there is a new platform, you can count on C being available. C++ will often be available as well, especially since GNU is frequently the base for the toolset and C++ comes along with C. This means that Ada might not be an option, but most static and many dynamic analysis tools will be.
One session I did want to mention was Undercover C++: What's Efficient and What Isn't by Stephen Dewhurst from Semantics Consulting. It was great, and it reminded me of issues such as why tools like static analysis are worthwhile.
One of the issues brought out in the session dealt with the overhead of virtual functions. It is true that virtual functions have more overhead than non-virtual functions, but consider the alternative when dealing with polymorphic objects. The alternative is tagged objects dispatched through a switch statement, an approach that is error prone and typically carries more overhead than the double-indirect virtual function call.
So take a fresh look.