# Setup

AI Bridge runs inside the Coder control plane (coderd), requiring no separate compute to deploy or scale. Once enabled, coderd runs `aibridged` in-memory and brokers traffic to your configured AI providers on behalf of authenticated users.
Required:

- A Premium-licensed Coder deployment
- The feature enabled via a server flag or environment variable
- API key(s) configured for one or more providers
## Activation

Enable AI Bridge explicitly:

```shell
CODER_AIBRIDGE_ENABLED=true coder server
# or
coder server --aibridge-enabled=true
```
## Configure Providers
AI Bridge proxies requests to upstream LLM APIs. Configure at least one provider before exposing AI Bridge to end users.
Set the following when routing OpenAI-compatible traffic through AI Bridge:

- `CODER_AIBRIDGE_OPENAI_KEY` or `--aibridge-openai-key`
- `CODER_AIBRIDGE_OPENAI_BASE_URL` or `--aibridge-openai-base-url`
The default base URL (`https://api.openai.com/v1/`) works for the native OpenAI service. Point the base URL at your preferred OpenAI-compatible endpoint (for example, a hosted proxy or LiteLLM deployment) when needed.
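As a concrete sketch, the settings above can be combined in one invocation. The API key and LiteLLM URL below are placeholders, not real values:

```shell
# Placeholder values for illustration only; substitute your own
# API key and OpenAI-compatible endpoint (e.g. a LiteLLM proxy).
export CODER_AIBRIDGE_ENABLED=true
export CODER_AIBRIDGE_OPENAI_KEY="sk-example-not-a-real-key"
export CODER_AIBRIDGE_OPENAI_BASE_URL="https://litellm.example.com/v1/"
coder server
```

Equivalently, the same values can be passed as the `--aibridge-*` flags shown above.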
> **Note**
> See the Supported APIs section below for precise endpoint coverage and interception behavior.

