
Edge Overview

The Edge deployment model balances control and convenience by keeping sensitive data within your environment while leveraging Pangea’s hosted control plane.

Edge represents our hybrid deployment model, bridging the gap between fully hosted Pangea SaaS and Private Cloud deployments (supporting AWS, Azure, and GCP). This architecture allows you to run specific Pangea services within your infrastructure, while managing them through Pangea's cloud platform.

By deploying certain data processing services locally, Edge ensures that sensitive data never leaves your infrastructure boundary. Meanwhile, you can configure and monitor these services through Pangea's cloud console, receiving updates and maintaining operational visibility without managing the entire Pangea software stack.

Note: Edge currently supports the following services:

  • Redact
  • AI Guard

How it works

  • Control plane - Service configuration and token management occur within Pangea's cloud environment.
  • Data processing within your cloud boundary - Sensitive data is processed entirely within your infrastructure using Pangea services deployed as container images.

[Diagram: Edge deployment]

Data flow:

  1. Applications send API requests to Pangea services deployed in your cloud environment (see the code sketch after this list).
  2. Services process requests locally and return results to the application.
  3. The control plane remains connected for updates, configuration, and metrics reporting.
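To make the flow concrete, here is a minimal sketch of step 1 from the application's side, using Python's requests library. The hostname is a placeholder for wherever your Edge service is reachable inside your network, and the request shape mirrors the hosted Redact API; check the API reference for the exact schema before relying on it.

```python
# Minimal sketch: an application calling a Redact service that runs
# inside your own environment. Hostname is a placeholder.
import os

import requests

EDGE_REDACT_URL = "https://redact.edge.internal.example.com/v1/redact"
token = os.environ["PANGEA_REDACT_TOKEN"]  # service token issued via the control plane

resp = requests.post(
    EDGE_REDACT_URL,
    headers={"Authorization": f"Bearer {token}"},
    json={"text": "Contact Jane Doe at jane.doe@example.com"},
    timeout=10,
)
resp.raise_for_status()

# Steps 2-3: processing happens locally and the redacted result comes
# back over your internal network; the raw text never leaves it.
print(resp.json())
```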

Key considerations

  • Data locality - Sensitive data stays within your cloud environment, meeting data residency and control requirements.
  • Technical expertise - Requires moderate infrastructure knowledge. Your team must deploy and maintain Pangea services in your cloud environment using container images. This is a manageable task for teams with containerization and cloud experience.
  • Best for - Teams with moderate infrastructure expertise who need localized processing for compliance or latency reasons and can accept configuration and metrics being handled outside their cloud boundary.

Understanding Edge

Edge implements a split architecture that separates data processing from service management. All data processing occurs within your infrastructure boundary through containerized services, ensuring that your data never leaves your control. Meanwhile, Pangea's cloud infrastructure handles service configuration, updates, and monitoring.

This separation allows you to focus on what's important to you:

  • Your sensitive data is processed by services running in your environment, while Pangea's control plane handles service configuration updates and collects usage metrics for monitoring.
  • You gain the ability to meet strict compliance requirements and data sovereignty needs while avoiding the operational overhead of managing the entire service stack.

Core components

The Edge architecture consists of two distinct elements:

  • Customer cloud components

    • These are customer-deployed services that run within your infrastructure boundary.
    • Specific Pangea services (like AI Guard or Redact) are deployed as container images in your environment.
    • These services process sensitive data locally, ensuring that raw data never leaves your control.
  • Pangea cloud components

    • Control plane - Handles service configuration and token management.
    • Data plane - Provides the core Pangea services through our cloud infrastructure.
    • Service configuration - Manages service settings and parameters.
    • Metrics collection - Gathers operational metrics for monitoring.

Operational flow

Here are the steps that your data follows (an SDK-based sketch follows the list):

  1. Your applications interact directly with Pangea services deployed in your environment, sending requests for data processing.
  2. These services process all data locally within your infrastructure - raw data never leaves your environment.
  3. Results are returned directly to your applications through your internal network.
  4. The services maintain a connection to Pangea's control plane for two purposes:
    • Sending authentication status and usage metrics back to Pangea
    • Receiving configuration updates from Pangea
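In practice, most applications call these services through a Pangea SDK rather than raw HTTP. The sketch below assumes the Python SDK (pangea-sdk) can be pointed at your Edge deployment through its standard configuration; the domain value is a placeholder and Edge-specific settings may differ, so treat this as a sketch rather than the definitive setup.

```python
# Sketch: calling the locally deployed Redact service through the Pangea
# Python SDK (pip install pangea-sdk). The domain is a placeholder, and
# the exact Edge configuration may differ from this assumption.
import os

from pangea.config import PangeaConfig
from pangea.services import Redact

token = os.environ["PANGEA_REDACT_TOKEN"]

# Point the client at the service running in your infrastructure instead
# of Pangea's hosted endpoint (placeholder address).
config = PangeaConfig(domain="redact.edge.internal.example.com")
redact = Redact(token, config=config)

# Steps 1-3: the request and result stay on your internal network. Step 4's
# control-plane connection (configuration updates, metrics) is maintained by
# the Edge service itself, not by your application code.
response = redact.redact(text="Call 555-0100 and ask for Jane Doe")
print(response.result)
```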

Deployment options

Edge supports two deployment approaches:

Single container deployment provides a straightforward path to implementing Edge (see the sketch after this list). This approach works well for:

  • Development environments.
  • Proof-of-concept implementations.
  • Scenarios with moderate processing requirements.
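As a rough illustration of the single container approach, the snippet below starts an Edge service container with the Docker SDK for Python (docker-py). The image name, environment variables, and port mapping are placeholders rather than Pangea's actual distribution details; follow the Edge deployment guide for your service to get the real values.

```python
# Sketch: running a single Edge container via the Docker SDK for Python
# (pip install docker). Image name, env vars, and port are placeholders.
import os

import docker

client = docker.from_env()

container = client.containers.run(
    "example.registry/pangea/redact-edge:latest",  # placeholder image reference
    name="pangea-redact-edge",
    detach=True,
    environment={
        # Placeholder variable name; the container's real configuration
        # keys come from the Edge deployment guide.
        "PANGEA_EDGE_TOKEN": os.environ["PANGEA_EDGE_TOKEN"],
    },
    ports={"8000/tcp": 8000},  # placeholder port mapping
    restart_policy={"Name": "unless-stopped"},
)

print(container.short_id)
```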

Kubernetes deployment provides additional operational capabilities for production workloads (a scaling example follows the list):

  • Container orchestration for service management.
  • Built-in scaling capabilities.
  • High availability options.
  • Standard Kubernetes monitoring and logging.
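As a small illustration of the scaling point, the sketch below uses the official Kubernetes Python client to scale a hypothetical Edge deployment; the deployment name and namespace are placeholders, and the equivalent kubectl scale command works just as well.

```python
# Sketch: scaling a hypothetical Edge deployment with the Kubernetes
# Python client (pip install kubernetes). Names are placeholders.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod
apps = client.AppsV1Api()

# Scale the (hypothetical) "pangea-redact-edge" deployment to three replicas.
apps.patch_namespaced_deployment_scale(
    name="pangea-redact-edge",
    namespace="pangea-edge",
    body={"spec": {"replicas": 3}},
)
```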

Next steps

With an understanding of Edge's architecture, you're ready to begin implementation. Choose your deployment platform and follow our guides to get started.
