Abstract

Many applications are built upon private algorithms, and executing them in untrusted, remote environments poses confidentiality issues. To some extent, these problems can be addressed by ensuring the use of secure hardware in the execution environment; however, an insecure software stack can provide only limited algorithm secrecy. This paper aims to address this problem by exploring the components of the Trusted Computing Base (TCB) in hardware-supported enclaves. First, we provide a taxonomy and an extensive understanding of the trade-offs made during secure enclave development. Next, we present a case study of existing secret-code execution frameworks, which suffer from poor TCB design because they process secrets with commodity software inside enclaves. The enlarged attack surface leaves additional footprints in memory that break the confidentiality guarantees; as a result, the private algorithms are leaked. Finally, we propose an alternative approach for remote secret-code execution of private algorithms. Our solution removes the potentially untrusted commodity software from the TCB and provides a minimal loader for secret-code execution. Based on our new enclave development paradigm, we demonstrate three industrial templates for cloud applications: ① computational power as a service, ② algorithm querying as a service, and ③ data querying as a service.
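To make the minimal-loader idea concrete, the sketch below shows only the bare mechanics, outside of any enclave: a byte buffer holding a (trivially XOR-masked) secret algorithm is "decrypted" into memory, made executable, and invoked. The payload, the XOR masking, and the Linux/x86-64 setting are illustrative assumptions, not the paper's implementation; in the proposed design these steps would run inside the enclave, with the real decryption key released only after attestation.

    /* Illustrative, non-enclave sketch of a minimal loader for secret code.
     * The payload and the XOR "cipher" are hypothetical; Linux/x86-64 only. */
    #include <stdint.h>
    #include <stdio.h>
    #include <sys/mman.h>

    /* x86-64 machine code for: mov eax, 42; ret  (XOR-masked with 0xAA). */
    static const uint8_t encrypted_algo[] = {
        0xB8 ^ 0xAA, 0x2A ^ 0xAA, 0x00 ^ 0xAA, 0x00 ^ 0xAA, 0x00 ^ 0xAA, 0xC3 ^ 0xAA,
    };

    int main(void)
    {
        size_t len = sizeof(encrypted_algo);

        /* Stage 1: reserve writable (not yet executable) memory. */
        uint8_t *buf = mmap(NULL, len, PROT_READ | PROT_WRITE,
                            MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (buf == MAP_FAILED) return 1;

        /* Stage 2: "decrypt" the secret code into the buffer. */
        for (size_t i = 0; i < len; i++)
            buf[i] = encrypted_algo[i] ^ 0xAA;

        /* Stage 3: flip the page to read+execute and run the algorithm. */
        if (mprotect(buf, len, PROT_READ | PROT_EXEC) != 0) return 1;
        int (*algo)(void) = (int (*)(void))buf;
        printf("secret algorithm returned %d\n", algo());

        munmap(buf, len);
        return 0;
    }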

Highlights

  • Managing trust in remote execution environments is an enduring challenge

  • The novelty of our work is as follows: ① we consider mutually distrustful entities (Hardware Owner (HO), Data Owner (DO), and Algorithm Owner (AO)) with conflicting interests in the cloud, ② we differentiate between private algorithms and private data, ③ we show bad practices in the use of Trusted Execution Environments (TEEs) in the cloud, ④ we create a taxonomy for secure execution of private algorithms in untrusted remote environments, ⑤ we provide practical insights into enclave development, ⑥ we perform a security analysis of existing dynamic code loaders with interpreter enclaves, and ⑦ we evaluate our execution model in three adversarial settings in the cloud

  • We show two different approaches in Section 2.3: ① the HO develops an enclave that maintains the code secrecy after its release, and ② the AO develops an enclave that ensures the code secrecy before its release


Summary

Introduction

Managing trust in remote execution environments is an enduring challenge. This is because privacy-sensitive data and private algorithms remain unprotected in remote computers. There are a number of reasons why owners of private algorithms may need to run their algorithms in an untrusted environment. This may occur when the Algorithm Owner (AO) requires the capabilities of a Hardware Owner (HO), for example, to gain larger computing power in the cloud. Intel’s SGX is a trusted hardware solution [2, 23, 39] that provides a novel development model for enclave binaries, as well as hardware-maintained (ring -3) integrity guarantees for computations at user level (ring 3). SGX’s enclave development model helps by securing sensitive parts of user-level applications [39]. Enclaves can prove their identity to verifier entities that require evidence before proceeding to execution. SGX is a good TEE solution for application developers.
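As a rough illustration of this development model, the host-side sketch below loads a signed enclave and performs a single ECALL. It assumes the Intel SGX SDK; the enclave image name, the EDL-generated header, and the ecall_run_secret() proxy are hypothetical placeholders rather than anything defined in the paper.

    /* Untrusted host-side sketch (assumes the Intel SGX SDK is installed). */
    #include <stdio.h>
    #include <sgx_urts.h>
    #include "enclave_u.h"   /* generated by sgx_edger8r from a hypothetical EDL */

    int main(void)
    {
        sgx_enclave_id_t eid = 0;
        sgx_launch_token_t token = {0};
        int token_updated = 0;

        /* Load and initialize the signed enclave image. */
        sgx_status_t rc = sgx_create_enclave("enclave.signed.so", SGX_DEBUG_FLAG,
                                             &token, &token_updated, &eid, NULL);
        if (rc != SGX_SUCCESS) {
            fprintf(stderr, "sgx_create_enclave failed: 0x%x\n", rc);
            return 1;
        }

        /* Transition into the enclave; the private algorithm would only ever
         * be decrypted and executed inside it. */
        int result = 0;
        rc = ecall_run_secret(eid, &result);   /* hypothetical ECALL */
        if (rc == SGX_SUCCESS)
            printf("enclave returned %d\n", result);

        sgx_destroy_enclave(eid);
        return 0;
    }

In practice, a verifier would first check an attestation quote bound to the enclave's measurement before releasing any secret code or data to it.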
