Edge overview

Edge is Pangea's hybrid deployment model, bridging the gap between fully hosted SaaS and private cloud deployments. It allows you to run specific Pangea services within your own infrastructure (on AWS, Azure, or GCP) while managing them through Pangea's cloud platform.

By deploying certain data processing services in your environment, Edge ensures your sensitive data is processed locally and never leaves your infrastructure boundary. Meanwhile, you can configure and monitor these services through Pangea's cloud console, receiving updates and maintaining operational visibility without needing to manage the entire Pangea service stack.

Note

Edge currently supports the following services:

  • Redact
  • AI Guard (beta)

Understanding Edge

Edge implements a split architecture that separates data processing from service management. All sensitive data processing occurs within your infrastructure boundary through containerized services, ensuring that raw data never leaves your control. Meanwhile, Pangea's cloud infrastructure handles service configuration, updates, and monitoring.
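
In practice, this split shows up in application code as little more than an endpoint choice: requests for data processing go to the service running inside your network, while configuration changes (for example, redaction rules) are made in the Pangea console and delivered to that service by the control plane. The Python sketch below illustrates the idea; the internal hostname and port are placeholders for wherever you expose the Edge container, not values defined by Pangea.

    import os

    # Data plane: the Edge-deployed service your application calls directly.
    # The hostname and port are placeholders for your own environment.
    EDGE_REDACT_URL = os.environ.get(
        "PANGEA_REDACT_URL", "http://redact.internal.example.com:8000"
    )

    # Service token created in the Pangea console. Service configuration is
    # also managed in the console and delivered to the Edge service by the
    # control plane, so it never appears in application code.
    PANGEA_TOKEN = os.environ.get("PANGEA_REDACT_TOKEN", "")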

This separation lets you benefit from both models:

  • Your sensitive data is processed by services running in your environment, while Pangea's control plane handles configuration updates and collects usage metrics for monitoring.
  • You can meet strict compliance and data sovereignty requirements without taking on the operational overhead of managing the entire service stack.

Edge deployment

Core components

The Edge architecture consists of two distinct elements:

  • Customer cloud components

    • Customer-deployed services that run within your infrastructure boundary.
    • Specific Pangea services (such as Redact) delivered as container images and deployed in your environment.
    • These services process your sensitive data locally, ensuring raw data never leaves your control.
  • Pangea cloud components

    • Control Plane: Handles service configuration and token management.
    • Data Plane: Provides the core Pangea services through our cloud infrastructure.
    • Service Configuration: Management of service settings and parameters.
    • Metrics Collection: Gathering operational metrics for monitoring.

Operational flow

Here are the steps your data follows; the sketch after the list shows steps 1 through 3 in code:

  1. Your applications interact directly with Pangea services deployed in your environment, sending requests for data processing.
  2. These services process all data locally within your infrastructure; raw data never leaves your environment.
  3. Results are returned directly to your applications through your internal network.
  4. The services maintain a connection to Pangea's control plane for two purposes:
    • Sending authentication status and usage metrics back to Pangea
    • Receiving configuration updates from Pangea
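
As a concrete illustration of steps 1 through 3, the sketch below sends text to an Edge-deployed Redact service and reads the result. It assumes the Edge container exposes the same interface as Pangea's hosted Redact API (a POST to /v1/redact authorized with a service token); the base URL is a placeholder for your internal deployment. Step 4 happens out of band between the container and Pangea's control plane, so it never appears in application code.

    import os

    import requests

    # Placeholder for wherever the Edge-deployed Redact container is reachable
    # inside your network; the request below never leaves your environment.
    EDGE_REDACT_URL = os.environ.get(
        "PANGEA_REDACT_URL", "http://redact.internal.example.com:8000"
    )
    # Pangea service token, created and managed through the Pangea console.
    PANGEA_TOKEN = os.environ["PANGEA_REDACT_TOKEN"]


    def redact_text(text: str) -> str:
        # Step 1: the application sends raw text to the locally deployed service.
        response = requests.post(
            f"{EDGE_REDACT_URL}/v1/redact",
            headers={"Authorization": f"Bearer {PANGEA_TOKEN}"},
            json={"text": text},
            timeout=10,
        )
        response.raise_for_status()

        # Steps 2-3: processing happened inside your infrastructure, and the
        # result returns over your internal network in Pangea's standard
        # response envelope (status, summary, result).
        body = response.json()
        if body.get("status") != "Success":
            raise RuntimeError(f"Redact request failed: {body.get('summary')}")
        return body["result"]["redacted_text"]


    print(redact_text("Contact Jane Doe at jane.doe@example.com"))

An AI Guard (beta) deployment would follow the same pattern against its own endpoint, differing only in the request fields and the shape of the result.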

Deployment options

Edge supports two deployment approaches:

Single container deployment provides a straightforward path to implementing Edge. This approach works well for:

  • Development environments.
  • Proof-of-concept implementations.
  • Scenarios with moderate processing requirements.

Kubernetes deployment provides additional operational capabilities for production workloads (see the sketch after this list):

  • Container orchestration for service management.
  • Built-in scaling capabilities.
  • High availability options.
  • Standard Kubernetes monitoring and logging.
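
When Edge runs on Kubernetes, in-cluster applications typically reach the service through a cluster-internal Service name rather than a fixed host, letting Kubernetes load-balance across replicas during scaling or rolling updates. The sketch below assumes a hypothetical Service named redact in a pangea-edge namespace; use whatever names and ports your own manifests define.

    import requests

    # Hypothetical cluster-internal address (<service>.<namespace>.svc.cluster.local);
    # Kubernetes balances requests across the Redact replicas behind it.
    EDGE_REDACT_URL = "http://redact.pangea-edge.svc.cluster.local:8000"


    def redact_with_retry(text: str, token: str, attempts: int = 3) -> dict:
        # A short timeout plus a simple retry keeps a pod restart or rolling
        # update from failing the request outright.
        last_error = None
        for _ in range(attempts):
            try:
                response = requests.post(
                    f"{EDGE_REDACT_URL}/v1/redact",
                    headers={"Authorization": f"Bearer {token}"},
                    json={"text": text},
                    timeout=5,
                )
                response.raise_for_status()
                return response.json()
            except requests.RequestException as exc:
                last_error = exc
        raise RuntimeError("Edge Redact service unreachable") from last_error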

Next steps

With an understanding of Edge's architecture, you're ready to begin implementation. Choose your deployment platform and follow the corresponding guide to get started.
