
Using Intel SGX Enclaves in NFC-enabled TPM-based Local Attestation

Previously, Matthew Garrett and I came up with a new idea for a method of local attestation. Local attestation here means: authenticating to the computer that the user possesses a valid hardware token, and authenticating to the user that the computer is executing the intended code and that said code has not been tampered with. The idea is to use some NFC-enabled “smart” wearable device, something trivially hideable on (or inside¹) one’s person, to authenticate to the TPM, which then validates that the next stage of code to be executed, usually the kernel (ring 0) or the hypervisor (ring “-1”), has verifiable integrity. Matthew has a great 32c3 talk on TPM-based local attestation, and even briefly mentions the NFC ideas towards the end of the video.
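To make the flow concrete, here is a rough, purely illustrative Python model of the two checks; none of the names below correspond to a real TPM or NFC API, and the key, digests, and helper functions are all hypothetical. The idea is simply: the measurement of the next boot stage must match a known-good digest, and the wearable must prove possession of a shared secret over NFC, before the machine agrees to continue.

    # Purely illustrative model of the two checks; no real TPM, NFC, or boot code.
    import hashlib
    import hmac
    import os

    # Hypothetical known-good measurement of the next boot stage (e.g. the kernel),
    # and a secret shared with the NFC wearable at enrolment time.
    KNOWN_GOOD_KERNEL = b"pretend this is the kernel image"
    EXPECTED_DIGEST = hashlib.sha256(KNOWN_GOOD_KERNEL).digest()
    SHARED_TOKEN_KEY = os.urandom(32)   # provisioned onto the wearable beforehand

    def measure_next_stage(image: bytes) -> bytes:
        """Stand-in for measuring the next stage (a PCR extend, in a real TPM)."""
        return hashlib.sha256(image).digest()

    def wearable_response(nonce: bytes) -> bytes:
        """What the NFC wearable would compute: proof of possession of the shared key."""
        return hmac.new(SHARED_TOKEN_KEY, nonce, hashlib.sha256).digest()

    def attest_and_authenticate(image: bytes) -> bool:
        # 1. Code-integrity check: does the next stage match the known-good measurement?
        if not hmac.compare_digest(measure_next_stage(image), EXPECTED_DIGEST):
            return False
        # 2. User-presence check: challenge the wearable over NFC with a fresh nonce.
        nonce = os.urandom(16)
        expected = hmac.new(SHARED_TOKEN_KEY, nonce, hashlib.sha256).digest()
        return hmac.compare_digest(wearable_response(nonce), expected)

    # Boots only when both the code and the user check out.
    print(attest_and_authenticate(KNOWN_GOOD_KERNEL))        # True
    print(attest_and_authenticate(b"tampered kernel image"))  # False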

As an example use case, this would allow journalists² greater safety when crossing borders. Your laptop got taken away by the TLA at a border? Not such a problem; it simply doesn’t boot without you present. The TLA took your laptop into the back room to try to install some malware on it? No worries, because your laptop will refuse to boot the next time you try to do so (or it could signal in some other way that the system was compromised… however, refusing to decrypt the user’s hard drive is probably a bare minimum safety requirement, and refusing to boot at all is probably the safest).

However, all of this places a great deal of trust in both the TPM device and its manufacturer…

Despite Joanna Rutkowska’s concerns over untrusted user input/output, it would be interesting to see a system, built upon the above local attestation method, which uses an Intel SGX enclave (see the Intel Instruction Set Extensions Programming Reference for architectural details) to execute code whose integrity has been previously verified through two-factor authenticated TPM local attestation. This doesn’t require user I/O, and it doesn’t require anything to be displayed to the user. What it would provide, however, is a way for the code whose integrity is verified by the TPM to remain safely isolated from:

  • the BIOS, or tampering thereof,
  • System Management Mode (SMM), and,
  • (possibly) Intel Active Management Technology (AMT) — modulo Intel’s SGX implementation (and how much you trust said implementation to protect you from their AMT backdoor).

This protects against tampering with the BIOS itself, which could otherwise subvert the initialisation of the TPM hardware and cause the integrity verification checks to falsely pass. Without SGX, SMM (ring “-2”) would have the capability to emulate and/or forward calls to and from the TPM device, and as such any SMM-based attack would completely subvert the local attestation.

Additionally, in Matthew’s and my NFC-TPM-based local attestation method, the cryptographic code for verification would need to be partially executed on the “smart” device. In Matthew’s 32c3 talk, the laptop uses a pre-shared key, stored in the TPM, to generate a Time-based One-Time Password (TOTP), which is a very simple scheme used for two-factor authentication, and which essentially does:

TOTP ← Truncate(HMAC(SharedKey, TimeInterval))

The output is then presented as a QR code on the screen, which the user scans with the external device (a smart phone, in this case), which also computes the TOTP to check that the TPM verification was successful.
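For concreteness, here is a minimal sketch of that computation in Python, following the usual RFC 6238 construction (SHA-1, 30-second time step, 6 digits); the shared key value here is hypothetical and would in reality be sealed in the TPM and provisioned onto the device:

    import hashlib
    import hmac
    import struct
    import time

    def totp(shared_key: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
        """RFC 6238 TOTP: HMAC the current time interval with the shared key,
        then dynamically truncate the result to a short decimal code."""
        interval = int((time.time() if for_time is None else for_time) // step)
        msg = struct.pack(">Q", interval)                       # 8-byte big-endian counter
        digest = hmac.new(shared_key, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                               # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    # Both sides hold the same key: the laptop displays totp(key) as a QR code,
    # and the external device recomputes it and compares.
    key = b"a pre-shared key sealed in the TPM"   # hypothetical
    print(totp(key))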

Smart phones being security nightmares, it’s nice in my opinion to avoid them altogether. (And certainly to never rely on them in any trusted computing scheme!) Alternatively, one could imagine some smart³ jewelry⁴ such as a necklace or bracelet (cufflinks could also be pretty badass) with an embedded NFC-capable smartcard. Unfortunately, a smartcard likely means you’re running in a JVM… which — my livid hatred for the Java programming language aside — hasn’t exactly had the best track record in terms of security. This also unfortunately probably restricts us to using only the set of cryptographic primitives which are PKCS#11-compatible, in order to facilitate communication between the smartcard and the TSS. One interesting area for further research would be a way to remove this requirement, i.e. to use something other than a smartcard, and/or to devise a scheme for moving execution (on either side) into an SGX enclave as well.
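As a rough idea of what that smartcard-to-host interaction might look like, here is a sketch using the PyKCS11 bindings. It assumes a token that exposes an HMAC mechanism and holds a pre-provisioned secret key labelled “totp-key”; the module path, PIN, and label are all hypothetical, and plenty of cards expose no HMAC mechanism at all, which is exactly the kind of PKCS#11 restriction mentioned above.

    # Sketch only: assumes a PKCS#11 token exposing CKM_SHA256_HMAC and a
    # pre-provisioned secret key labelled "totp-key"; path, PIN, and label are hypothetical.
    import struct
    import time

    import PyKCS11

    lib = PyKCS11.PyKCS11Lib()
    lib.load("/usr/lib/opensc-pkcs11.so")                     # hypothetical module path

    slot = lib.getSlotList(tokenPresent=True)[0]
    session = lib.openSession(slot, PyKCS11.CKF_SERIAL_SESSION)
    session.login("123456")                                   # hypothetical PIN

    # Find the shared secret key on the card; the key itself never leaves the token.
    key = session.findObjects([(PyKCS11.CKA_CLASS, PyKCS11.CKO_SECRET_KEY),
                               (PyKCS11.CKA_LABEL, "totp-key")])[0]

    # Ask the card to HMAC the current time interval on our behalf.
    interval = struct.pack(">Q", int(time.time() // 30))
    mac = bytes(session.sign(key, interval,
                             PyKCS11.Mechanism(PyKCS11.CKM_SHA256_HMAC, None)))

    session.logout()
    session.closeSession()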

Moving forward towards more secure computing platforms, the most realistic candidate I can currently imagine would be a hardware-modified Thinkpad which uses the above local attestation scheme to verify the integrity of QubesOS’s security-critical code and of Coreboot (the latter of which could also be verified from within QubesOS, e.g. via Joanna’s Anti-Evil Maid system, though only post-boot, and I’m unsure whether that would be compatible with some of the extra protections against malicious SMM code which Coreboot can provide, such as verifying the ramstage upon wake from S3). Provided these integrity checks pass, and the user possesses a valid hardware-authentication token, Coreboot can then be executed (without needing to trust SMM) and go on to initialise Qubes’ Xen hypervisor, which then executes dom0 and so on.


¹ Matthew’s rather grotesque aside was, “Well… you want to limit the number of parts they have to cut off of you…”
² Well… anyone actually. But everyone likes to pretend journos are special and the rest of us are second-class citizens, right?
³ Yes, I hate that word too. Shut up and mark your bingo card already.
⁴ I’d just like to take this opportunity to coin the term SmartSchmuck.

