BYOK
Bring Your Own Keys

You bring your LLM API keys. The platform orchestrates your AI agents. Your data flows directly from your agents to your LLM provider -- never passing through, never stored by, and never visible to the platform vendor.

Tags: Architecture · Data Sovereignty · Zero Trust · Privacy

What is BYOK?

Most AI platforms work like this: you send your data to the platform, the platform sends it to an LLM, the LLM responds, and the platform stores both the input and output. Your data lives on someone else's servers, processed by someone else's API keys, visible to someone else's engineers.

BYOK inverts this. You provide your own API keys for Anthropic, OpenAI, Google, Mistral, or any other LLM provider. The platform never sees the content of your prompts or responses. It orchestrates the agent -- deciding which tool to call, which agent to hand off to, what to do next -- but the actual LLM communication goes directly from your agent to your LLM provider, authenticated with your keys.
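The split described above can be sketched in a few lines. This is an illustrative model, not MeetLoyd's actual API: the function names, the orchestration payload, and the endpoint table are all hypothetical. The key point is which process holds the API key and which process sees only metadata.

```python
import json
import urllib.request

# Hypothetical sketch of the BYOK split (names are illustrative, not MeetLoyd's API):
# the platform returns only an orchestration decision from metadata; the LLM request
# itself is built and sent by the agent, authenticated with the customer's own key.

def orchestrate(step: dict) -> dict:
    """Platform side: decide the next action from metadata only.

    Receives provider name and token counts -- never the prompt text itself.
    """
    if step["tokens_used"] > step["budget"]:
        return {"action": "halt", "reason": "budget exceeded"}
    return {"action": "call_llm", "provider": step["provider"]}

def build_llm_request(provider: str, api_key: str, prompt: str) -> urllib.request.Request:
    """Agent side: the request goes straight to the provider, with YOUR key."""
    endpoints = {"openai": "https://api.openai.com/v1/chat/completions"}
    body = json.dumps({"model": "gpt-4o", "messages": [{"role": "user", "content": prompt}]})
    return urllib.request.Request(
        endpoints[provider],
        data=body.encode(),
        headers={"Authorization": f"Bearer {api_key}"},  # customer-held key, never the vendor's
    )

decision = orchestrate({"provider": "openai", "tokens_used": 1200, "budget": 50000})
# The platform saw only counts and a provider name -- the prompt never reached it.
```

Because the prompt only ever appears inside `build_llm_request`, the orchestration layer can enforce budgets and routing without any ability to read or retain conversation content.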

Data flow comparison

Traditional AI Platform vs BYOK Architecture

Traditional Platform

  • Your Data --> Vendor Cloud: data is sent to the vendor's servers
  • Vendor Cloud --> Stored: prompts and responses are stored on vendor infrastructure
  • Vendor Keys --> LLM Call: vendor's API keys, vendor's usage terms

BYOK Architecture

  • Your Data --> Your LLM Keys: data goes directly to your LLM provider
  • Platform --> Orchestrates Only: the platform decides what to do but never sees content
  • Your Keys --> Your Terms: your API agreement, your data processing terms

Why BYOK matters for enterprise

BYOK is not optional

Most AI platforms offer BYOK as an upgrade or enterprise add-on. This creates a dangerous default: customers on lower tiers have their data flowing through the vendor's keys, often without realizing it.

The question every CISO should ask: "What happens to our data on your cheapest plan?"

How MeetLoyd implements BYOK

BYOK is mandatory at every tier on MeetLoyd -- Starter ($399/mo), Growth ($1,499/mo), and Enterprise. There is no plan where MeetLoyd processes your data with our keys. This is an architectural decision, not a pricing tier feature.

  • Broad provider support -- Anthropic (Claude), OpenAI (GPT), Google (Gemini), Mistral, Groq, any OpenAI-compatible endpoint, and self-hosted vLLM. Bring keys for one or all.
  • Key encryption -- Your API keys are encrypted with AES-256-GCM and stored in MeetLoyd's vault. They are decrypted only at the moment of the LLM call, in memory, never logged.
  • LLM Gateway -- Every LLM call passes through the gateway for budget checks, prompt injection detection, PII redaction, and content moderation -- but the gateway operates on metadata, not on the content of your prompts.
  • No token billing -- Since you use your own keys, MeetLoyd charges for platform access (agents + seats), not token consumption. You pay Anthropic/OpenAI directly.
  • Memory Pointer Architecture -- Agent memory stores pointers and summaries, not raw conversation content. Even MeetLoyd's database doesn't contain your full LLM interactions.
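The Memory Pointer idea in the last bullet can be made concrete with a small sketch. This is not MeetLoyd's actual schema; the record fields and the hash-derived pointer are illustrative assumptions. What it shows is the invariant: the stored record carries a summary, a pointer, and metadata, while the raw exchange is never a field on it.

```python
import hashlib
from dataclasses import dataclass

# Illustrative sketch of a pointer-style memory record (NOT MeetLoyd's actual schema):
# the platform database keeps a short summary and an opaque pointer to where the full
# exchange lives, plus metadata for governance -- never the raw conversation text.

@dataclass(frozen=True)
class MemoryRecord:
    summary: str       # compact, agent-written summary of the turn
    pointer: str       # opaque reference; derived here from a content hash
    token_count: int   # metadata the gateway can budget against

def store_turn(raw_exchange: str, summary: str) -> MemoryRecord:
    """Persist a pointer and summary; the raw exchange is hashed, not stored."""
    pointer = hashlib.sha256(raw_exchange.encode()).hexdigest()[:16]
    return MemoryRecord(summary=summary, pointer=pointer,
                        token_count=len(raw_exchange.split()))

record = store_turn(
    raw_exchange="User asked about Q3 churn; agent pulled CRM data and replied in detail.",
    summary="Discussed Q3 churn using CRM data",
)
# record holds a summary, a 16-char pointer, and a token count -- no raw text field.
```

Under this design, even a full dump of the platform database yields summaries and opaque pointers, not recoverable conversations, which is what the bullet above claims for MeetLoyd's storage layer.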

See Trust & Security -->

BYOK + governance

BYOK alone isn't enough. Without governance, you're just replacing one risk (vendor data exposure) with another (Shadow AI running on your own keys). That's why MeetLoyd pairs BYOK with platform-level governance: budget checks, prompt injection detection, PII redaction, and content moderation through the LLM Gateway.


Your keys. Your data.
Our governance. That's MeetLoyd.
