Rob and Joerg

A talk with the Trusted Computing Group on Cybersecurity

June 19, 2020
The Trusted Computing Group maintains a library of specifications (most recently TPM 2.0) to combat the growing sophistication of cybersecurity threats worldwide.

When it comes to electronic systems and their protection, it’s almost surprising that it took so long before seeing significant industry-wide efforts to ensure safety, privacy, and security at the hardware level. This is not to say that there weren’t very good solutions available to those who took the effort, such as government and military users, but until recently there wasn’t as much attention at the consumer level.

This has of course changed in recent years, and everyone is now cognizant of the importance of effective security against cyberattacks. Lately there have been organized efforts to establish security standards, providing a common framework and protocols that ensure proper use and compliance.

One such effort is being undertaken by the Trusted Computing Group (TCG), who maintains a library of specifications (most recently TPM 2.0) to combat the growing sophistication of cybersecurity threats worldwide. The challenges facing the cybersecurity industry are only increasing as technological advances continuously create a more dangerous risk environment.

For example, the NotPetya malware attack in 2017 underscored how dangerous such attacks can be. It caused the global shipping firm Maersk significant financial damage and impacted companies and their operations worldwide.

TPM empowers applications, allows more platform specifications to be built, and simplifies management by supporting additional cryptographic algorithms, while also providing capabilities that improve the security of TPM services. We recently sat down with Rob Spiger from Microsoft and Joerg Borchert from Infineon (both member companies) to talk about TCG and evaluation engineering.

EE: So let's just jump into this. When we talk about the whole aspect of security, it's a very big buzzword. It's almost like the blind men and the elephant: everybody sees it from a different angle, so they all come away with a different perspective. Why don't you start, Rob? What are your thoughts on that?

Rob Spiger: When I think about the security that Trusted Computing Group works on, it's really defining a hardware root-of-trust that can be used to form the basis of establishing trust in a larger system. So there are hardware roots of trust for measurement, reporting, and storage, and you could use those with a couple of different technologies the organization has developed, to make assurances about what software is running on a device and that it has integrity, and that really needs to be built in from the ground up at the beginning of the boot process.
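As an illustration of the measurement root of trust Rob describes, here is a minimal sketch of the TPM-style "extend" operation, in which a register can only be folded forward with each boot stage's measurement, never set directly. This is not TCG code; the boot-stage names are made up, and Python's hashlib stands in for the TPM's hardware-protected register.

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    # TPM-style extend: the register value can only be combined with
    # new measurements, so the final value reflects every stage in order.
    return hashlib.sha256(pcr + measurement).digest()

# Hypothetical boot stages, measured in the order they run.
stages = [b"bootloader-v1", b"kernel-v5.4", b"initrd"]

pcr = b"\x00" * 32  # the register starts at a known initial value
for blob in stages:
    pcr = extend(pcr, hashlib.sha256(blob).digest())

# Any change to any stage (or to their order) yields a different final
# value, which is what lets a verifier detect integrity differences.
print(pcr.hex())
```

Because the extend operation is one-way, software that runs later cannot rewrite the record of what ran earlier, which is why the chain has to start at the very beginning of the boot process.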

EE: That's a big aspect because up until very recently, security was considered almost a software-only aspect, but today if it doesn't include some hardware aspect, it's not really secure. What are your thoughts on that Joerg?

Joerg Borchert: We started as a group relatively early, in 1998, and the Trusted Computing Group itself was incorporated in 2003. In that journey, the hardware root of trust is the element that provides the trust anchor, for the organization on one hand and for the users on the other.

EE: Now, that's one of the reasons I wanted to bring you gentlemen into the conversation in the context of the Evaluation Engineering audience: because security is now also a hardware development issue, it presents challenges to the development engineer when it comes to evaluation, test, and measurement. What's your take on it from that perspective, Rob?

Rob Spiger: I think the statement you made earlier is absolutely true. If you're looking at software and trying to figure out whether it has integrity or not, with just software alone, whatever software runs first really owns the environment and the machine. If you can have hardware that's able to reflect the software that's loaded, then you can do better evaluation of things like, if I load different software does my measurement set change? Can I see that the integrity differences are reflected in how I look at the health of the device? So I think those things can be really helpful in looking at it from a measurement standpoint and looking at the integrity of things can really improve security.

EE: So Joerg, what are your thoughts on that aspect of security now being part of the hardware development process?

Joerg Borchert: I want to add that protection in hardware is really critical, especially for personally identifiable information such as user IDs and passwords, and also for storage, such as the drives in systems. These are critical aspects that cannot be entrusted to software alone.

EE: Now, the fact that it's in hardware presents a number of challenges, I would imagine, in systems integration and characterization. Where would you start with that, Joerg? I mean, from the perspective of the Trusted Computing Group and the capabilities you want the user to develop in their hardware, what are the challenges in taking your solutions and having them implemented in theirs?

Joerg Borchert: The main challenge in this process is evaluation: evaluation of the hardware, evaluation of the firmware, and evaluation of the system. This is a process we have to take very seriously, because any open ends would undermine our whole-system approach. Rob can dig deeper into this specific topic, but we have a process that allows evaluation by third parties, so it's not just internal TCG technology.

Rob Spiger: I think one of the things that can really help developers building solutions is using standardized hardware interfaces, and TCG has helped define a lot of those for different technologies. With a standardized hardware interface, you can also verify compatibility with that interface, so the same software can be reused on different platforms in a consistent way, and you can lower the overall amount of validation needed by reusing software across multiple platforms.

Another aspect is certification from hardware manufacturers: software developers writing firmware can have assurance that the layer they're working on top of is standardized and will provide specific functionality. TCG also has mechanisms for hardware implementers to provide certificates showing that the manufacturer attests its implementation is consistent with the standard, and potentially that a third party has performed a security evaluation of that hardware implementation. That can be a great starting point for developers to build something secure.

EE: So why don't you walk us through a short chain of events, let's say from walking in your front door with, say, a design in my hand for a doorbell, to me making sure that someone can't hack pictures of my mailman?

Rob Spiger: Sure. That's quite a long chain, but the thing is, you want to look at it in the context of a threat model and think about what the threats to that device are going to be. Then, for each of those threats, think about the probability it will occur, and decide how much investment you're going to make as a developer to mitigate those risks.

Then think about the things that could go wrong, and how hardware will help you recover from them. For example, if you have a root of trust that allows you to update a device, a core piece will verify updates even if some of the software at a higher layer is flawed. That way you may have a way to recover the device into a trusted state even if a vulnerability is exploited.

So, thinking about how recovery could happen, and how to reestablish trust in something that's been compromised, is really important. The hardware-based technologies that Joerg mentioned for protecting keys or identity are very helpful, because if a compromise occurs, you may not need to take that device back to a factory environment to reestablish trust in it. If hardware protects the keys so you don't lose them, then by updating the software and verifying it, you may still be able to use the same credentials that were originally provisioned on the device and continue using it normally after the recovery process.
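The recovery path Rob describes hinges on a small, trusted core that checks an update's authenticity before applying it. The sketch below is a stand-in only: it uses an HMAC with a shared key, whereas a real root of trust would typically hold a public key and verify an asymmetric signature. The key and image names are invented for illustration.

```python
import hashlib
import hmac

# Stand-in for the manufacturer's signing key; a real device would
# embed a public key and verify an asymmetric signature instead.
ROOT_KEY = b"example-root-key"

def sign_update(image: bytes) -> bytes:
    # Performed by the vendor when publishing an update.
    return hmac.new(ROOT_KEY, image, hashlib.sha256).digest()

def verify_and_apply(image: bytes, tag: bytes) -> bool:
    # The immutable core checks the update before it ever runs, so
    # even a compromised higher layer can be replaced safely.
    if not hmac.compare_digest(sign_update(image), tag):
        return False  # reject: image is not from the trusted source
    # ...write the verified image to flash here...
    return True

good = b"firmware-v2"
assert verify_and_apply(good, sign_update(good))        # accepted
assert not verify_and_apply(b"malware", sign_update(good))  # rejected
```

Because verification happens in the core rather than in the (possibly compromised) application layer, a flawed device can be walked back to a trusted state in the field.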

EE: Okay, now what about in the field itself? Some companies like Microchip offer a complete chain of ownership from their factory floor to your factory floor. Does TCG get involved in any of that verification?

Rob Spiger: TCG doesn't get into the supply chain that much, I would say. It's mostly vendors claiming compliance with the interfaces and specifications TCG produces. TCG does have certification programs around specific things like the trusted platform module, and it defines a certificate so a company can state that a TPM is one they manufactured; that's probably the main area where TCG helps communicate assurances of trust between vendors. TCG maintains a list of certified TPMs and runs other certification programs, such as for storage and Trusted Network Connect interoperability.

Probably a good example of the way something like that can work with our technologies is the Device Identifier Composition Engine (DICE), which is intended to start the measurement chain for very low-cost devices, because it doesn't consume a lot of resources. What it does is generate a certificate for each phase of the boot process. So at the end of the process, the device can use a TLS connection to a server and send a chain of certificates as part of authenticating that connection. Each certificate in the chain can show the device manufacturer as well as integrity measurements for each software layer.

So if you think about a supply chain where different vendors build different portions of the software that's loaded during the boot process, you could see each of those vendors' software modules as the boot process progresses. The receiver of that certificate chain can verify all of the participants that made up the full software stack on that device, or at least enough of the stack to have continued assurance that security policy enforcement you trust is in place, such as validation of applications loaded later.
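The layered derivation behind the DICE flow Rob outlines can be sketched very roughly: each layer's secret is a one-way function of the previous layer's secret and the next layer's measurement, so a change in any layer changes everything derived after it. This is a simplification under assumed names, not the DICE specification; a real flow would also issue a certificate at each step and present the chain over TLS.

```python
import hashlib
import hmac

def next_cdi(cdi: bytes, layer_image: bytes) -> bytes:
    # Simplified DICE-style step: derive the next compound device
    # identifier from the current secret and the next layer's hash.
    measurement = hashlib.sha256(layer_image).digest()
    return hmac.new(cdi, measurement, hashlib.sha256).digest()

# Illustrative unique device secret and boot layers (invented names).
uds = b"unique-device-secret"
layers = [b"rom-loader", b"firmware", b"application"]

cdi = hashlib.sha256(uds).digest()
for image in layers:
    cdi = next_cdi(cdi, image)

# A verifier who trusts the device's initial secret can check that the
# final identity could only arise from exactly these layer measurements.
print(cdi.hex())
```

Swapping in a modified firmware image would produce a different derived identity, which is what lets the receiver of the certificate chain detect a tampered layer anywhere in the stack.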

EE: Are there any other aspects of the Trusted Computing Group and the solution set that the engineering audience should know within the development context?

Joerg Borchert: I wanted to point out that as the Trusted Computing Group we also work with standards organizations in this field, like NIST, which provides a framework of technology-independent requirements, for example for resilient platforms. We work very closely with these standards organizations to create a level playing field across trusted computing.

EE: So then, are there any final thoughts that you wanted to leave with our audience about the Trusted Computing Group and the importance of security in the development process?

Rob Spiger: I think it'd be great to point out that, in addition to standards, we have a lot of white papers and guidance that are free and publicly available on the TCG website for anyone interested to browse and take a look at. There's no fee to access our standards and publications. So if you're interested, check it out on our website; depending on the scenario you're looking at, it could be very useful.

Joerg Borchert: Yeah, the Trusted Computing Group is a very dynamic organization. The process might take longer, but on the other hand it provides the stability and the backbone for security evaluations and security implementations, and we are happy that you reached out to us.
