Earlier this year, Barr Group’s Embedded Systems Safety and Security Survey found that approximately 25% of the internet-connected embedded systems projects currently under development are part of what we call “The Internet of Dangerous Things”: projects that can kill or injure someone. We also asked these designers whether security requirements are part of their design specs. A shocking 22% said no. That’s right: 22% of the designers of connected devices that can potentially kill or injure a person have no security requirements for their projects at all. This is a problem.
Mirai. Brickerbot. Stuxnet. These are just a few of the viruses and worms attacking vulnerable embedded systems and IoT devices today, and the list is growing. We know that for legacy systems, patching every potential point of entry is impossible. For current and future embedded systems, however, there’s no excuse. Security is now a necessity. And as designers of embedded systems, choosing to ignore security in our projects makes us part of the problem.
So how can we fix this? Here are six points to consider:
1. Don’t Ignore Security!
As stated in the Association for Computing Machinery (ACM) and IEEE codes of ethics, we as professional embedded-systems engineers have an ethical duty NOT to ignore security. ACM Code of Ethics Rule 1.2, entitled “Avoid Harm to Others,” emphasizes the importance of following best software practices. It also discusses the necessity for engineers to assess the “social consequences” of their systems, and our obligation to “blow the whistle” when members of the development team or management intentionally fail to correct a product’s known safety-related risks. This is especially important for projects whose failure could kill or injure end users.
IEEE Rule 1 clearly states that engineers must accept personal and professional responsibility for the safety, welfare, and health of society, or at the very least for the users of their products and those potentially endangered by them. These days, consumers assume that the products they’re purchasing are safe and secure. As professionals, we’re obligated to protect the users of our designs. It’s time we live up to our responsibilities.
2. Adopt Proven Industry Best Practices
Coding standards, code reviews, and static analysis are just three of the many industry best practices that embedded systems engineers can use to develop safer, more secure devices. They’re also among the most impactful, yet inexpensive to implement and use on an ongoing basis. By integrating just these three practices, you will reduce the number of bugs in your systems. Fewer bugs mean fewer weak links in the security chain, making it more difficult for hackers to conduct attacks successfully.
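As one small illustration of the kind of rule such coding standards enforce, consider a helper that copies a message into a fixed-size buffer. The function name, error codes, and buffer size below are hypothetical; the point is the discipline itself: validate pointers, check lengths explicitly, and return a status code rather than silently truncating. These are exactly the properties that code reviews and static analyzers can verify mechanically.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical status codes for a bounded-copy helper. */
typedef enum {
    COPY_OK          = 0,
    COPY_ERR_NULL    = -1,   /* A pointer argument was NULL. */
    COPY_ERR_TOO_BIG = -2    /* Source would overflow the destination. */
} copy_status_t;

/* Copy src into dst, refusing anything that would overflow dst.
 * No silent truncation: the caller must handle the error explicitly. */
copy_status_t msg_copy(uint8_t *dst, size_t dst_len,
                       const uint8_t *src, size_t src_len)
{
    if ((dst == NULL) || (src == NULL)) {
        return COPY_ERR_NULL;
    }
    if (src_len > dst_len) {
        return COPY_ERR_TOO_BIG;   /* Would overflow: reject, don't truncate. */
    }
    memcpy(dst, src, src_len);
    return COPY_OK;
}
```

A caller that ignores the return value here is easy for both a reviewer and a static-analysis tool to flag, which is precisely how these practices reinforce one another.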
Many companies choose to have just a handful of engineers focus on the security of their systems. What has proven more effective, however, is educating your entire team on industry best practices. When the whole team adopts these practices from the beginning of product development, the final product will be more robust.
3. Use Cryptography
In this year’s survey, we found that fewer than half of those who were concerned about security in their systems were encrypting their data. For those designing potentially dangerous devices, this is even more concerning. When designing safety-critical devices, those that can injure or even kill their users, engineers should at the very least be encrypting their data to protect those users.
4. Secure Your Bootloader
In “The Flaming-Printer Attack,” a hacker invaded thousands of unsecured printers to demonstrate the vulnerability of those devices. The susceptible printers would accept a firmware download disguised as a PostScript file, without any authentication that it had been sent by the manufacturer.
So, what’s the lesson here? Build a secure bootloader. Ensure that firmware updates are not only encrypted, but also signed, so that the bootloader can authenticate that they came from you. Also make it possible to update the firmware in the field, so you can respond to attacks and threats as they emerge. These design additions may not be perfect, but they will prevent your device from accepting just any download, and they will help you combat attacks if and when the time comes.
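One detail of that authentication step is worth sketching. After the bootloader computes an authentication tag over the received image (an HMAC or a signature digest, depending on your scheme), it must compare that tag against the expected value, and a plain memcmp() is the wrong tool: it returns at the first mismatched byte, leaking timing information an attacker can use to forge a tag byte by byte. The function below is an illustrative sketch of a constant-time comparison, not code from any particular bootloader SDK.

```c
#include <stddef.h>
#include <stdint.h>

/* Compare two authentication tags in constant time.
 * Every byte is examined regardless of where the first mismatch occurs,
 * so execution time does not reveal how much of a forged tag was correct.
 * Returns 1 if the tags are equal, 0 otherwise. */
int tag_equal_ct(const uint8_t *a, const uint8_t *b, size_t len)
{
    uint8_t diff = 0u;

    for (size_t i = 0u; i < len; ++i) {
        diff |= (uint8_t)(a[i] ^ b[i]);   /* Accumulate differences; no early exit. */
    }

    return (diff == 0u);
}
```

In a bootloader, this comparison sits at the end of the chain: decrypt the image, recompute its tag with the device’s key, and only jump to the new firmware when tag_equal_ct() confirms a match.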
5. Practice Defense in Depth
Practice defense in depth: assume any single protection can fail. Walk through the different scenarios for how and why an attacker might gain access to your system. Perhaps there’s a weak link in the system. Perhaps the gatekeeper itself can be compromised. No matter how or why an attacker might break in, think about it. Plan for it. Design for it. That is our best chance at creating a more secure environment for our users.
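A minimal sketch of what layering looks like in practice, assuming a hypothetical network command handler (the frame layout, command codes, and size limit below are invented for illustration): each check is independent, and defeating any single one is not enough to make the device act.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical protocol limits and command codes. */
#define MAX_FRAME_LEN   32u
#define CMD_REBOOT      0x01u
#define CMD_GET_STATUS  0x02u

/* Layer 3: a whitelist of the only commands this device will ever act on. */
static bool cmd_is_allowed(uint8_t cmd)
{
    return (cmd == CMD_REBOOT) || (cmd == CMD_GET_STATUS);
}

/* Returns true only if every defensive layer passes. */
bool handle_frame(const uint8_t *frame, size_t len, bool session_authenticated)
{
    if ((frame == NULL) || (len < 2u) || (len > MAX_FRAME_LEN)) {
        return false;   /* Layer 1: reject malformed or oversized input. */
    }
    if (!session_authenticated) {
        return false;   /* Layer 2: no commands without an authenticated session. */
    }
    if (!cmd_is_allowed(frame[0])) {
        return false;   /* Layer 3: unknown commands are dropped, not "tried". */
    }
    /* ... dispatch frame[0] with payload frame[1..len-1] ... */
    return true;
}
```

The design point is that the layers overlap: even if an attacker hijacks an authenticated session, the length check and the command whitelist still constrain what they can do.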
6. Stay Educated
And finally, stay educated about security. Pay attention to not only what is going on in the embedded space, but also learn about attacks happening in the desktop computing world and in the smartphone computing world. These adjacent spaces are where many trends begin. Know about the latest bots, viruses, and malware that are invading the internet. Understand that these attacks are not “someone else’s” problem. It’s a problem that we must all face and combat as an industry.
To help in the effort to develop safer internet-connected designs, Barr Group has made its Embedded C Coding Standard available as a free downloadable PDF. The rules of the Barr Group coding standard are intentionally compatible with, and complementary to, the stricter MISRA C Guidelines for the Use of the C Language in Critical Systems.