What is Confidential Computing?

Confidential computing is a privacy-preserving technology that protects data while it is being processed. Unlike traditional methods that focus on protecting data when it’s stored (at rest) or moving between systems (in transit), confidential computing secures data in use — that is, while it is actively being analyzed or computed.

This is made possible through hardware-based Trusted Execution Environments (TEEs). These are isolated areas within a computer’s processor that run code and process data in a protected environment. Even if the operating system or underlying infrastructure is compromised, the data and computation inside a TEE remain protected from unauthorized access.


Why Confidential Computing Exists

Organizations today work with sensitive information, including customer data, intellectual property, proprietary algorithms, financial records, health data, and more. While encryption can protect this information when it’s stored or transferred, it often must be decrypted to perform any kind of processing. That moment of decryption can expose data to internal threats, cloud providers, or attackers exploiting system vulnerabilities.

Confidential computing fills that gap. It allows data to stay protected even when in active use by:

  • Keeping data and workloads isolated in the processor
  • Allowing only authorized code to access the data
  • Preventing external actors from viewing or interfering with the process

This method allows multiple parties to work together on sensitive datasets without revealing the raw data to each other or to the infrastructure provider, making it a crucial tool for data security and regulatory compliance.

How Confidential Computing Works

A Trusted Execution Environment is a secure enclave within a processor. TEEs operate separately from the rest of the system and only allow pre-approved code to run inside them.

The process works through several steps:

  1. Code Verification: The code to be run is verified through a process called attestation
  2. Environment Setup: Once verified, the TEE is established and locked down
  3. Secure Processing: Data sent into the TEE is decrypted only inside the protected space
  4. Isolated Computation: Computations are performed with no external access
  5. Protected Output: Results are encrypted before leaving the TEE
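The five steps above can be sketched in code. The snippet below is a toy simulation, not a real enclave: the `enclave_process` function stands in for code running inside a TEE, and a simple XOR cipher stands in for the authenticated encryption a real deployment would use. The key exchange that would normally follow attestation is assumed to have already happened.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'encryption', standing in for real authenticated encryption."""
    return bytes(d ^ k for d, k in zip(data, key))

def keystream(key: bytes, length: int) -> bytes:
    """Repeat the key to cover the message (illustration only, not secure)."""
    return (key * (length // len(key) + 1))[:length]

# --- Outside the TEE: the data owner encrypts before sending ---
key = secrets.token_bytes(32)  # in practice, shared via an attested key exchange
plaintext = b"patient_record:glucose=5.4"
ciphertext = xor_bytes(plaintext, keystream(key, len(plaintext)))

# --- Inside the TEE (simulated) ---
def enclave_process(ct: bytes, k: bytes) -> bytes:
    pt = xor_bytes(ct, keystream(k, len(ct)))        # step 3: decrypt only inside
    computed = pt.upper()                            # step 4: isolated computation (toy)
    return xor_bytes(computed, keystream(k, len(computed)))  # step 5: encrypt output

encrypted_result = enclave_process(ciphertext, key)

# --- Outside again: only the key holder can read the result ---
result = xor_bytes(encrypted_result, keystream(key, len(encrypted_result)))
print(result)  # b'PATIENT_RECORD:GLUCOSE=5.4'
```

The point of the structure is that plaintext exists only inside `enclave_process`; everything that crosses the boundary in either direction is ciphertext.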

Attestation Explained

Attestation is the process of proving that a TEE is genuine and running the expected code. 

There are two types:

  • Local Attestation: Verification between enclaves on the same platform
  • Remote Attestation: Verification from external parties, crucial for cloud deployments

No one else, including the cloud provider, its administrators, or the host operating system, can access what’s happening inside the TEE.
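The core idea of remote attestation can be sketched as follows. This is a simplified model: the hardware "measures" (hashes) the code loaded into the enclave and signs that measurement to produce a quote, which a remote verifier checks against the code it expects. Real TEEs sign quotes with vendor-certified asymmetric keys and certificate chains; the shared `HARDWARE_KEY` here is a stand-in for that root of trust.

```python
import hashlib
import hmac

# Stand-in for the hardware root of trust; real attestation uses
# vendor-certified asymmetric keys, not a shared secret.
HARDWARE_KEY = b"simulated-root-of-trust"

def measure(code: bytes) -> bytes:
    """The TEE measures (hashes) the code loaded into the enclave."""
    return hashlib.sha256(code).digest()

def produce_quote(code: bytes) -> dict:
    """Enclave side: sign the measurement to form an attestation quote."""
    m = measure(code)
    return {
        "measurement": m,
        "signature": hmac.new(HARDWARE_KEY, m, hashlib.sha256).digest(),
    }

def verify_quote(quote: dict, expected_code: bytes) -> bool:
    """Remote verifier: check the signature is genuine, then compare the
    reported measurement against the hash of the code we expect to run."""
    sig_ok = hmac.compare_digest(
        quote["signature"],
        hmac.new(HARDWARE_KEY, quote["measurement"], hashlib.sha256).digest(),
    )
    code_ok = quote["measurement"] == measure(expected_code)
    return sig_ok and code_ok

enclave_code = b"def aggregate(rows): return sum(rows)"
quote = produce_quote(enclave_code)
print(verify_quote(quote, enclave_code))      # True: code matches expectations
print(verify_quote(quote, b"tampered code"))  # False: measurement mismatch
```

Only after verification succeeds would the remote party release secrets (such as decryption keys) to the enclave, which is why attestation precedes any data transfer in step 1 of the flow above.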

Benefits of Confidential Computing

Organizations adopt confidential computing for a variety of reasons, particularly when handling private, regulated, or proprietary data. Some of the main benefits include:

  • Data Privacy: Sensitive data remains protected throughout its lifecycle, including while in use.
  • Collaborative Analysis: Multiple organizations can work together on joint analytics or machine learning tasks without exposing raw data to each other.
  • Data Protection Against Insider Threats: Even cloud administrators and infrastructure operators cannot view or manipulate the data inside a TEE.
  • Compliance: Helps meet strict data privacy regulations such as GDPR, HIPAA, and others by reducing risk during processing.
  • Trust in Public Cloud: Makes it safer to process sensitive workloads on cloud infrastructure by reducing reliance on the cloud provider’s internal security.

Who Provides Confidential Computing Capabilities?

Confidential computing is supported by several major cloud environments and hardware vendors. These include:

  • Microsoft Azure Confidential Computing: Azure offers a range of confidential computing options using Intel SGX and AMD SEV-SNP hardware. These support both virtual machines and containerized workloads.
  • AWS Nitro Enclaves: Amazon Web Services uses a virtualization-based TEE architecture that provides isolated environments on EC2 instances. Nitro Enclaves support secure key handling, cryptographic operations, and attestation.
  • Google Cloud Confidential Computing: Google’s approach uses confidential VMs built on AMD SEV technology. It allows customers to protect data in use without changing their application code significantly.

Confidential computing is also supported through open standards and collaborations like the Confidential Computing Consortium, which promotes ecosystem growth and interoperability.

Considerations and Limitations

Confidential computing offers strong privacy protections, but there are some trade-offs. Deployment may require additional steps like attestation and key management, which can add complexity. In some cases, only specially written code can run inside a TEE, and availability may be limited across confidential cloud regions. Despite these factors, confidential computing remains a valuable option for secure data collaboration and privacy-focused workloads.

Common Use Cases for Confidential Computing

Confidential computing is useful across a wide range of industries. Common applications include:

  • Healthcare collaborations between hospitals and research institutions, protecting patient data
  • Financial services across banks or insurers, without sharing raw customer records
  • Supply chain insights in manufacturing, where vendors analyze performance data without exposing proprietary operations
  • AI model evaluations, where data buyers can try a model on their own data without downloading or reverse-engineering it

Because the computations happen inside TEEs, these collaborations do not expose the data or models outside the protected environment.

How Duality Uses Confidential Computing

At Duality, confidential computing plays an important role in our broader platform for data confidentiality and collaboration. TEEs are one of several Privacy-Enhancing Technologies (PETs) available to our users, alongside methods like fully homomorphic encryption (FHE), federated learning (FL), and differential privacy (DP).

We integrate confidential computing into the platform to allow organizations to work together on private or regulated data without revealing it to collaborators or infrastructure providers. This can include:

  • Running AI/ML models on private datasets without exposing the underlying data
  • Evaluating proprietary models while keeping model architecture and weights hidden
  • Conducting joint analytics between institutions without needing to move or centralize data
  • Protecting intermediate weights when running a federated learning flow

For example, this means a pharmaceutical company can run a drug trial across data from multiple hospitals, or a bank can detect fraud patterns across industry data, all without any party seeing the raw inputs from the others.

Our platform integrates directly with TEE-backed infrastructure such as AWS Nitro Enclaves and Google Confidential VMs. We also simplify what can be a complex setup process, handling tasks like attestation, key handling, and policy enforcement so users can focus on analysis and insight.