Duality Collaborative AI

Train and deploy AI models on sensitive data—across institutions or clouds—without compromising privacy, IP, or regulatory compliance.

Why Duality Collaborative AI?

AI Collaboration Without Exposure.

Whether you’re working with partners, regulators, or internal teams, Duality enables you to collaborate on AI without risking data or revealing proprietary models. Run your workflows with partners securely—on any model, any data, anywhere.


What Can You Do With It?


Duality Federated AI

Duality Federated AI enables organizations to train and analyze AI models collaboratively—without moving sensitive data and while protecting local computation results. Built on NVIDIA FLARE and the trusted execution environments (TEEs) offered by all major cloud vendors, this solution combines federated learning with TEEs to deliver privacy, compliance, and performance at scale. The platform supports data preprocessing, role-based governance, and automated encryption, enabling organizations to collaborate securely across geographies, sectors, and regulatory boundaries.

Whether it’s fraud detection, medical research, or clinical trials, organizations can use sensitive, distributed datasets without sacrificing control, accuracy, or privacy.
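The core idea—each party trains locally and shares only model updates, which a coordinator aggregates—can be sketched in a few lines. This is an illustrative weighted federated averaging (FedAvg) example, not the NVIDIA FLARE API; the update vectors and sample counts are invented for demonstration.

```python
def fed_avg(updates, n_samples):
    """Aggregate per-parameter model updates, weighted by each
    party's local dataset size. Raw data never leaves a party."""
    total = sum(n_samples)
    dim = len(updates[0])
    return [
        sum(u[i] * n for u, n in zip(updates, n_samples)) / total
        for i in range(dim)
    ]

# Three institutions report local updates and sample counts;
# only these updates—not the underlying records—are shared.
updates = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
counts = [100, 100, 200]

global_update = fed_avg(updates, counts)  # [3.5, 4.5]
```

In the Duality architecture this aggregation step runs inside a TEE, so even the coordinator never sees individual updates in the clear.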

Duality Centralized AI

Duality Centralized AI enables secure training and deployment of AI models in trusted cloud environments—without compromising the privacy of data or models. All assets, including data and models, are encrypted locally and transmitted securely to a Trusted Execution Environment (TEE), where they are decrypted and processed in isolation.

This architecture ensures that sensitive data remains protected throughout its lifecycle. Even the cloud provider, platform administrators, or infrastructure owners cannot access the decrypted data or model. Results are re-encrypted before they leave the TEE and delivered only to authorized recipients—guaranteeing both confidentiality and trust.
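The lifecycle described above—encrypt locally, decrypt and process only inside the TEE, re-encrypt before the result leaves—can be sketched as follows. This is a toy illustration of the data flow only: the XOR cipher stands in for real authenticated encryption (e.g. AES-GCM), `run_in_tee` is a hypothetical placeholder for an attested enclave, and key provisioning via remote attestation is omitted.

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for real authenticated encryption; XOR with the
    # same key both encrypts and decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# 1. The data owner encrypts locally; only ciphertext is transmitted.
key = os.urandom(32)
plaintext = b"sensitive record"
ciphertext = xor_cipher(plaintext, key)

def run_in_tee(ciphertext: bytes, key: bytes) -> bytes:
    # 2. Inside the enclave: decrypt and process in isolation.
    data = xor_cipher(ciphertext, key)
    result = data.upper()              # placeholder for model inference
    # 3. Re-encrypt before anything leaves the TEE.
    return xor_cipher(result, key)

# 4. Only an authorized recipient holding the key can read the result;
# the cloud provider sees ciphertext at every step.
encrypted_result = run_in_tee(ciphertext, key)
assert xor_cipher(encrypted_result, key) == b"SENSITIVE RECORD"
```

The design point is that plaintext exists only inside the attested enclave boundary; everything crossing the network or touching cloud storage is encrypted.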

Highlights

Secure Pathology Research at Dana-Farber

Cancer research often requires large-scale analysis of digital pathology images spread across hospitals and research centers. Privacy regulations make direct data sharing difficult, especially when dealing with protected health information (PHI). With Duality Collaborative AI, Dana-Farber Cancer Institute was able to collaborate with partners to train a cancer classification model on decentralized pathology images. Data never moved, and only encrypted model updates were transmitted and aggregated securely using a Trusted Execution Environment. The result: a high-quality model trained across institutions, matching the performance of centralized training—without breaching patient confidentiality.


Fighting Financial Fraud Across Institutions

Detecting financial fraud requires collaboration across banks and financial institutions—but sharing sensitive transaction data poses legal and competitive risks. With Duality Collaborative AI, financial institutions can securely train machine learning models on distributed transaction data without centralizing or exposing it. Each institution trains locally, with only encrypted model updates shared and aggregated inside secure enclaves. This enables cross-institution model development for fraud detection while protecting client privacy and proprietary algorithms.

Platform Solutions


Financial Services

Enable privacy-preserving data collaborations across your entire financial ecosystem.


Healthcare

Protect patient data across your healthcare network through privacy-preserving collaborations.


Government

Enable seamless, privacy-preserving data collaborations across government agencies.


Marketing

Transform your marketing strategies with data collaborations that respect privacy, driving digital advertising efforts with confidence and creativity.

Experience Secure Collaborative Computing

Maximize the value of sensitive, regulated, or confidential data.