
Secure AI Development & Training via Confidential Computing and TEEs

A shield on a circuit board represents hardware-based security combined with software-based privacy to secure AI development and training.

Confidential Computing: Security, Privacy, and Governance in AI Development & Training

Advanced data models like machine learning, generative AI, and all the others that fall under the generalized term “AI” promise massive leaps for good – from accelerated healthcare research to business growth to improved crime fighting. But as with any tool, these models can pose great risks when used improperly. In this case, the risk lies in how data is used to train the models and how the models are protected after ingesting that data. In October 2023, the Biden-Harris administration issued an executive order to address “safe, secure, and trustworthy AI.” The supporting fact sheet prioritizes the use and development of privacy enhancing technologies (PETs) to “protect Americans’ privacy.”

Fortunately, Duality’s Secure Data Collaboration platform utilizes many software PETs, individually as well as in combination, to address growing privacy needs and concerns when querying, analyzing, or training models with sensitive information. Recently, Duality added a hardware PET – secure enclaves, or trusted execution environments (TEEs) – to further streamline efforts to make the development and training of AI safe and secure. While TEEs have become easier to set up over time, administrators still face substantial work configuring user management controls, schema management, encryption key management, governance controls, and reporting. Just as it has done for raw privacy technologies like fully homomorphic encryption (FHE), the Duality platform provides all of these TEE requirements out of the box. Our ready-to-use approach lessens the burden on administrators and developers by reducing the complexity and risk of using such environments. Teams can focus on doing the work, rather than figuring out how to use the environment securely.

The Privacy Powerhouse – PETs integrated with Trusted Execution Environments (TEE)

Duality Technologies is recognized for its expertise in software-based cryptographic data protection solutions for data in use, most notably Fully Homomorphic Encryption (FHE). Building on this expertise, we proudly introduce the latest addition to our technology stack: Trusted Execution Environments (TEEs), also referred to as Secure Enclaves.

Unlike a software-based data clean room (DCR), a TEE is a hardware-based solution for protecting sensitive workloads. TEEs provide added security down to the hardware processing computations. Essentially, it’s a server with an isolated and protected space within the central processing unit (CPU), allowing for the processing of data in a secure and confidential manner. TEEs rely on hardware-based security mechanisms to create these isolated spaces, shielding them from the rest of the system and external threats.

As a hardware-based solution, TEEs are provided by all the leading cloud vendors – Amazon, Google, and Microsoft. While each has different characteristics, they all serve the same purpose. Cloud vendors provide TEEs as a raw technology that requires additional expertise to configure and manage. Duality, on the other hand, utilizes TEEs as yet another PET in the technology stack for its secure data collaboration platform. Our platform provides users the ability to utilize TEEs with the additional benefits of built-in collaboration management, governance, encryption key management, and exploratory data analysis features, among others. More importantly, Duality users will be able to combine TEEs with additional PETs such as Fully Homomorphic Encryption, Federated Learning (FL), and more.

Deployment Options: On-Premises and Cloud Agnostic Flexibility Meets Security & Privacy

Recognizing the diverse needs of our users, Duality Technologies offers deployment options that cater to both on-premises and cloud environments. Duality has incorporated TEEs as part of its privacy enhancing technologies stack and currently supports AWS Nitro Enclaves and Google Confidential Space. Azure TEE support is on our short-term roadmap.

What do TEEs enable users to do?

TEEs empower analysts and data scientists to leverage familiar tools and execute code with minimal restrictions. What sets TEEs apart is their hardware-based workload protection, ensuring that no one, including the cloud vendor, can access the data and workloads running inside the TEE. They also provide robust security while enabling computation and analysis on various structured and unstructured data types.

Attestations & Encryption Key Management Made Easy

Within a TEE, critical operations such as cryptographic computations and data processing occur with a high degree of trust and integrity using an attestation mechanism. Attestation enables TEEs to ensure that sensitive data and operations are safeguarded from unauthorized access, even from privileged users (e.g., cloud vendor employees) or malicious software, making them a vital technology for applications that demand robust security, like confidential data processing, secure collaboration, and intellectual property protection.

Security is paramount when it comes to a TEE; that’s the whole point. The typical TEE workflow requires engineering teams to configure resources to enable attestation, a process that employs cryptographic keys to ensure that the executed code aligns with predefined specifications. Unfortunately, managing these keys can be a complex and time-consuming process.

Duality Technologies, however, simplifies the process by automating encryption key management. Users no longer need to worry about these intricate details. After configuring the necessary permissions within the cloud Key Management System (KMS) platform, analysts and data scientists can seamlessly integrate the TEE into their workflow, focusing on what matters most: data analysis.
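The idea behind attestation-gated key release can be sketched in a few lines. The snippet below is a conceptual, stdlib-only illustration, not Duality’s implementation or a real KMS API: a hash stands in for the hardware measurement of the enclave image (analogous to a Nitro PCR value), and a key-release function mimics a KMS policy that only hands out the data key when the measurement matches what the collaborators approved. All names and values are hypothetical.

```python
import hashlib

# Hypothetical expected measurement of the approved enclave image.
# In a real deployment this value comes from the enclave build process.
EXPECTED_MEASUREMENT = hashlib.sha256(b"approved-enclave-image-v1").hexdigest()

def measure(enclave_image: bytes) -> str:
    """Hash the enclave image, standing in for the hardware's measurement."""
    return hashlib.sha256(enclave_image).hexdigest()

def release_key(attestation_measurement: str, data_key: bytes) -> bytes:
    """Mimic a KMS policy: release the data key only if the measurement
    matches the value the collaborating parties agreed on."""
    if attestation_measurement != EXPECTED_MEASUREMENT:
        raise PermissionError("attestation failed: unexpected code measurement")
    return data_key

# The approved image obtains the key; any modified image is rejected.
key = release_key(measure(b"approved-enclave-image-v1"), b"secret-data-key")
```

Automating exactly this kind of policy wiring, across real cloud KMS platforms rather than a toy check, is the burden the platform removes from administrators.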

When are TEEs useful?

Organizations that manage sensitive data or that need to protect their intellectual property should consider using TEEs as part of their workflows. A TEE can protect the data a company provides for use, as well as the IP and privacy of a model being trained on someone else’s data. For instance, an insurance company may want to train its proprietary cost model on patient data from a healthcare provider: number of visits, types of illnesses, treatments, costs, and so on. The healthcare provider may not feel comfortable providing that data to the insurer, or simply cannot, but would be happy to run the model in its own environment. The insurer, however, does not want the provider to see the model, nor can it accept the risk of the model being leaked. A TEE delivers a space in which both sides gain the protections and comfort needed to collaborate. This is just one example, but the workflow is commonly necessary and viable for various use cases across finance, healthcare, insurance, legal, telecom, IoT, and government agencies.

Use Case: A Medical Facility, Two Research Groups, Structured and Unstructured Data

To demonstrate the viability of the solution, Duality Technologies performed a proof of concept (POC) showcasing collaboration among three organizations: a medical center, a genetic research organization, and a pharma researcher. The pharma researcher aimed to study the genetic factors associated with pneumonia. To do so, patients’ chest X-ray images had to be analyzed to detect pneumonia, and those findings then linked to the patients’ genetic and demographic information. The linked data was then analyzed to determine the impact of genetic mutations on pneumonia susceptibility. All of this had to be performed while keeping both the patients’ X-ray images and their genetic information private and secure.
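The linkage step described above, joining per-patient image findings with structured genetic records, can be sketched as follows. This is an illustrative, self-contained stand-in, not the POC code: a stub function replaces the actual X-ray model, and all identifiers, variants, and records are hypothetical.

```python
# Sketch of the linkage step inside the enclave: join per-patient X-ray
# findings (from an image model, stubbed here) with genetic records,
# then tally pneumonia rates by genetic variant.

def classify_xray(image_id: str) -> bool:
    """Stub for the image model: True means pneumonia was detected."""
    findings = {"xray-001": True, "xray-002": False, "xray-003": True}
    return findings[image_id]

genetics = [
    {"patient": "p1", "variant": "rs123-A", "xray": "xray-001"},
    {"patient": "p2", "variant": "rs123-G", "xray": "xray-002"},
    {"patient": "p3", "variant": "rs123-A", "xray": "xray-003"},
]

# Link each record to the model's finding and aggregate by variant.
by_variant = {}
for rec in genetics:
    pneumonia = classify_xray(rec["xray"])
    cases, total = by_variant.get(rec["variant"], (0, 0))
    by_variant[rec["variant"]] = (cases + int(pneumonia), total + 1)

rates = {v: cases / total for v, (cases, total) in by_variant.items()}
```

Because both the image findings and the genetic records are only ever joined inside the enclave, neither data owner sees the other’s raw inputs; each party only contributes its encrypted share and receives the agreed aggregate output.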

(CAPTION: With Duality’s Secure Data Collaboration platform, three organizations joined together to analyze structured demographic and genetic data with unstructured pneumonia patient X-rays all while keeping the patients’ identities private and secure.)

To address this challenge, the Duality Technologies collaboration platform was leveraged on top of the capabilities provided by AWS Nitro Enclaves. Using the Duality platform, the collaborating organizations set the needed governance and controls, as well as the attestation process over the key management system, for secure collaboration. In the simulated POC, we utilized a Vision Transformer (ViT) model from the Hugging Face transformers library, fine-tuned to detect pneumonia using public X-ray images.

By combining Duality’s solution with a TEE in a public cloud environment, we ensured that sensitive data, such as X-ray images and genetic information, remained encrypted throughout the entire process. No party outside the TEE had access to the decrypted data or the underlying model – not Duality, and not the TEE provider (AWS). This approach addressed the privacy concerns associated with sharing sensitive information and enabled secure, collaborative analysis that was not previously possible.
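The "encrypted outside, decrypted only inside" flow can be illustrated with a minimal sketch. The cipher below is a deliberately simple SHA-256 counter keystream used only to make the example self-contained and runnable; a real deployment would use a vetted scheme such as AES-GCM, with the data key released by the cloud KMS only after attestation succeeds. All keys and payloads are hypothetical.

```python
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher (SHA-256 counter keystream XORed with the data).
    Illustrative only; not a production cipher."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

data_key = b"data-key-released-only-after-attestation"
xray_bytes = b"raw X-ray pixel data"

# Outside the enclave: the data owner encrypts before uploading.
ciphertext = keystream_xor(data_key, xray_bytes)

# Inside the enclave: after attestation, the KMS releases data_key and the
# workload decrypts in isolated memory (XOR keystream ciphers are symmetric).
plaintext = keystream_xor(data_key, ciphertext)
```

The point of the sketch is the sequencing: plaintext exists only inside the enclave boundary, so neither the cloud operator nor the other collaborators ever handle the decrypted X-rays or genetic records.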

Reduced Complexity Combined with Increased Security

In conclusion, there is no one-size-fits-all approach to data privacy protection in collaboration: different use cases are best served by different technologies. In some cases, the best solution is a combination of various privacy enhancing technologies together with computation and governance controls. The Duality platform offers a flexible, best-of-breed approach by including all such PETs in our “privacy layer,” upon which computation and collaboration (governance) layers are fully integrated for a true, ready-to-use experience. Adding TEEs to our stack enables support for any type of data in secure and confidential use and allows users to run sensitive and powerful workloads that were not previously possible. Going forward, we’re excited to combine TEEs with other PETs (like FHE and Federated Learning), which we’ll announce later in 2024.

Interested in securing and improving how your company leverages AI? Contact us for a free consultation with our data experts.
