Compliance isn't something you do once. Every feature you ship, every vendor you add, every new market you enter is a potential gap between what your product actually does and what your legal docs say.
These five gaps are the ones that catch founders off guard, usually when a potential enterprise customer asks for a DPA, a lawyer reviews your privacy policy, or a regulator comes knocking.
Gap 1: Your Privacy Policy Doesn't Match Your Actual Data Flows
This is the most common gap, and it compounds over time.
You write a privacy policy at launch. It says you collect email, name, and usage data. Then over the next 12 months:
- You add Mixpanel → now you're collecting detailed behavioral analytics
- You add Intercom → now support conversations contain customer data
- You add LogRocket → now you're recording user sessions
- You add Segment → now customer data routes through a data pipeline
- You add an AI feature → now user inputs go to OpenAI
Your privacy policy still says "email, name, and usage data."
Why it matters: Regulators compare what your policy says to what your product actually does. The gap is a violation regardless of intent. Enterprise customers do the same during security reviews.
How to close it: Audit your vendor list quarterly. Every vendor that touches user data should appear in your privacy policy's subprocessor list. If you add a tool, update the policy before the tool goes live.
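The quarterly audit above can be as simple as diffing the vendors actually in use against the subprocessors your policy discloses. A minimal sketch, where the vendor names and both lists are illustrative assumptions (in practice you'd pull these from your billing records and your published policy):

```python
# Hypothetical quarterly audit: compare vendors actually in use against
# the subprocessors the privacy policy discloses. All names are examples.

vendors_in_use = {"Mixpanel", "Intercom", "LogRocket", "Segment", "OpenAI"}

# Subprocessors currently listed in the privacy policy (illustrative).
policy_subprocessors = {"Mixpanel", "Intercom"}

# Vendors touching user data that the policy doesn't mention yet.
undisclosed = vendors_in_use - policy_subprocessors

# Vendors still listed in the policy but no longer in use.
stale = policy_subprocessors - vendors_in_use

if undisclosed:
    print("Update the policy before these go live:", sorted(undisclosed))
if stale:
    print("Listed but no longer in use:", sorted(stale))
```

Two sets and a diff is the whole idea; the hard part is keeping `vendors_in_use` honest, which is why the audit is tied to a calendar cadence rather than memory.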
Gap 2: Missing Data Processing Agreements
DPAs are the contractual equivalent of "I acknowledge this vendor processes my users' data and here's how they've agreed to handle it."
GDPR requires DPAs with every vendor that processes EU personal data. Many other regulations are heading the same direction.
The problem: most founders never sign DPAs. They sign up for tools, click through terms, and move on. The DPA is a separate document that usually requires a separate step.
Common vendors where DPAs are missed:
- Google Analytics / GA4
- Stripe (has a DPA for EU data)
- Intercom (separate DPA process)
- OpenAI / Anthropic (data processing terms)
- Sentry / error tracking (logs often contain PII)
- AWS / GCP / Azure (have DPAs, need to be accepted)
How to close it: Go through your vendor list. For each one that handles user data, find their DPA, sign it, and save it. Most take 5 minutes. This is also what enterprise customers ask for when they do security reviews: "can you provide your subprocessor list and DPAs?"
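One lightweight way to work through that vendor list is to keep a single record per vendor, with the DPA status and where the signed copy lives. A sketch under the assumption that you track this in code or a spreadsheet export; the vendor entries and file paths are illustrative, not a real inventory:

```python
# Hypothetical DPA tracker: one record per vendor. The derived "missing"
# list is exactly what an enterprise security review asks about.
from dataclasses import dataclass

@dataclass
class Vendor:
    name: str
    handles_user_data: bool
    dpa_signed: bool
    dpa_location: str = ""  # where the signed copy is saved

vendors = [
    Vendor("Stripe", handles_user_data=True, dpa_signed=True,
           dpa_location="legal/stripe-dpa.pdf"),
    Vendor("Sentry", handles_user_data=True, dpa_signed=False),
    Vendor("Plausible", handles_user_data=False, dpa_signed=False),
]

# Vendors that handle user data but have no signed DPA on file.
missing = [v.name for v in vendors if v.handles_user_data and not v.dpa_signed]
print("Vendors still needing a DPA:", missing)
```

When a customer asks for "your subprocessor list and DPAs," this table is the answer: filter on `handles_user_data`, and the `dpa_location` fields are the attachments.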
Gap 3: You Don't Have a Process for Data Subject Requests
GDPR, CCPA, and most modern privacy laws give users specific rights:
- Right to access their data
- Right to delete their data
- Right to correct their data
- Right to export their data (GDPR portability)
- Right to opt out of automated decision-making
You're legally required to handle these requests within specific timeframes (one month for GDPR, 45 days for CCPA).
Most SaaS products have no process for this. There's no way for users to request their data, no workflow for handling the request, and no guarantee you can actually fulfill it (because you don't have a full map of where user data lives).
How to close it:
- Add a data request email address to your privacy policy (start here; it's enough for early stage)
- Know where all user data lives in your system (this is your data map)
- Be able to export or delete a user's data on request
You don't need a fancy automated system at first. A documented manual process is compliant. What's not compliant is having no process at all.
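That documented manual process rests on the data map: a list of every store that holds a given user's data, so nothing gets missed on export or deletion. A minimal sketch, where the store names, descriptions, and stubbed routines are all hypothetical assumptions about a typical SaaS stack:

```python
# Hypothetical data map: every place a user's data lives, and what it is.
# Store names are examples; real entries come from your own architecture.
DATA_MAP = {
    "postgres.users": "account record (email, name)",
    "postgres.events": "behavioral analytics",
    "s3.uploads": "user-uploaded files",
    "intercom": "support conversations (via vendor API)",
}

def export_user(user_id: str) -> dict:
    """Collect the user's data from every store in the map (stubbed)."""
    return {store: f"<data for {user_id} from {store}>" for store in DATA_MAP}

def delete_user(user_id: str) -> list:
    """Delete from every store; returns which stores were covered (stubbed)."""
    return list(DATA_MAP)  # real code would call each store's delete path

export = export_user("u_123")
covered = delete_user("u_123")
```

Because both routines iterate over the same `DATA_MAP`, adding a new store (a new vendor, a new table) to the map automatically pulls it into both the access and deletion workflows.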
Gap 4: You're Using AI Without Disclosing It
If your product uses an LLM API (OpenAI, Anthropic, Gemini, or any other) and your privacy policy or product UI doesn't mention it, you have a gap.
This matters because:
- GDPR: LLM providers are subprocessors that must be listed and covered by DPAs
- CCPA: If user input trains your models, you must disclose this at or before collection and provide an opt-out
- EU AI Act: Chatbots and AI assistants must disclose they're AI (not human) to users
- FTC: AI-generated content in marketing or testimonials must be labeled
The disclosure doesn't need to be scary. "We use the Claude API to process your inputs and generate compliance recommendations" is sufficient in most cases.
What breaks your position:
- Adding a new LLM provider without updating your policy
- Starting to use user inputs for model training without disclosing it
- Deploying an AI chat feature without telling EU users they're talking to AI
Gap 5: Your Terms of Service Don't Cover What Your Product Does
Privacy policies get attention. Terms of service get forgotten.
Common ToS gaps that create real exposure:
Billing and cancellation: Vague language about refunds, downgrades, or what happens to data after cancellation. If a customer disputes a charge, "see our terms" only works if the terms are actually clear.
IP ownership of user content and outputs: If users create content in your product, or if your AI generates output based on their input, who owns it? If your ToS doesn't specify, the answer is ambiguous and ripe for dispute.
Export and deletion: Customers are increasingly asking: "If we cancel, can we export our data? When will it be deleted?" If your ToS doesn't answer this, it's a sales friction point and a potential legal problem.
Limitations on AI-generated content: If your product generates legal documents, compliance recommendations, or anything that could be relied upon, your ToS needs a clear disclaimer that outputs are not legal advice and require human review.
How to close it: Read your ToS against your current product. For every major feature (billing, AI outputs, user content, data retention on cancellation) ask: "Does the ToS cover this?" If not, add a clause.
The Pattern Behind All Five Gaps
Every one of these gaps has the same root cause: your product changed but your docs didn't.
This isn't negligence. It's just how SaaS works. You ship, you iterate, you add vendors and features. Your legal docs can't keep pace unless you have a system for updating them.
The fix isn't a one-time legal review. It's a process that ties product changes to compliance reviews. When you add a new vendor, someone asks "does this need a DPA?" When you ship an AI feature, someone asks "does our privacy policy cover this?"
That's the ongoing compliance problem. And it's the one nobody has solved yet.
Compliance Gap Checklist
Run through this quarterly, or whenever you ship a major feature or add a new vendor:
- Privacy policy matches actual vendor/subprocessor list
- DPAs in place with every vendor handling user data
- Data request process documented and functional
- AI/LLM usage disclosed in privacy policy
- AI chatbots disclose they're AI to users (EU)
- ToS covers billing, cancellation, IP, export, and AI output limitations
- New vendors reviewed for compliance requirements before going live