Exposure to multiple languages can be useful, but does the language matter?
I started my programming career in high school, where I learned BASIC on a time-sharing system via a Teletype machine complete with a paper tape punch and reader. I also picked up a rudimentary assembler for a simple virtual machine. I wound up working on COBOL and FORTRAN compilers that were written in BASIC, because BASIC was the only language the time-sharing system exposed. This was the start of my interest in programming languages. Over the years, I have used everything from APL, Algol, and Ada to Lisp, LabVIEW, and Visual Basic. C and Java are also in the mix.
Having a wide choice of programming languages has often been the norm for mainframes, minicomputers, and microprocessors targeting businesses, although there was often a dominant or preferred language. Microcontrollers used to come only with macro assemblers, but these days C (and possibly C++) is the minimum provided by chip vendors. Unique processor architectures tended to limit tool support, but the dominance of ARM and x86 today gives language developers fewer architectures to support even as the number of target platforms grows.
Another factor in embedded development is the increase in processor performance and available memory, along with standardization on 32- and 64-bit platforms. These changes allow languages like Java, Lua, or Python to be used in deeply embedded systems.
The definition of “deeply embedded” is often where arguments about language suitability begin, because embedded applications cover a lot of ground these days, from motor control to IoT gateways to process control systems. Real-time systems with tight timing requirements may limit the languages that can be used, but developers typically have more choices than just C (even for bare-metal applications).
Applications that use Linux have almost any programming language available in one form or another. At that point, does the programming language matter?
Another way of looking at this is to ask whether C should be the language of choice or another language should be used. For example, would Python be more useful because of the power it gives developers compared to C? Many will point to C++ because of its compatibility with C, but C++ should really be considered alongside languages like Java, Python, and Ada because it shares functionality with that class of languages. It is possible to use C++ as a better C, but that approach doesn’t use C++ to its full advantage.
Of course, other considerations often have more impact on which programming languages might be used for a particular application. For example, the use of deep neural networks (DNNs) is increasing. Python is one language that has become popular in this space. It may be advantageous to use Python if it provides better support for a DNN framework and can meet the rest of an application’s needs.
C is still the language of choice for many embedded applications, but that is often due to inertia rather than being an optimal (or even preferable) choice. Issues such as maintainability, software reuse, and project size should be considered along with the ability to create reliable and secure code.
Embedded programmers are being asked to manipulate more complex data and handle more communication than ever. Security, reliability, and maintainability are now critical aspects of a design. C can do the job, and it is being used, but it is far from ideal.