Permissions don’t persist in AI apps — and that’s a big problem

John Gamble

I’ve spent much of this past quarter on the road, speaking with teams across the country about their AI product initiatives, and I’ve been struck by one challenge that comes up again and again: when developing retrieval-augmented generation (R...
