Setup

AI Gateway runs inside the Coder control plane (coderd), requiring no separate compute to deploy or scale. Once enabled, coderd runs aibridged in-memory and brokers traffic to your configured AI providers on behalf of authenticated users.

Required:

  1. A Premium license with the AI Governance Add-On.
  2. The feature must be enabled using the server flag.
  3. One or more provider API keys must be configured.

Activation

You will need to enable AI Gateway explicitly:

export CODER_AIBRIDGE_ENABLED=true
coder server

# or
coder server --aibridge-enabled=true

Configure Providers

AI Gateway proxies requests to upstream LLM APIs. Configure at least one provider before exposing AI Gateway to end users.

Set the following when routing OpenAI-compatible traffic through AI Gateway:

  • CODER_AIBRIDGE_OPENAI_KEY or --aibridge-openai-key
  • CODER_AIBRIDGE_OPENAI_BASE_URL or --aibridge-openai-base-url

The default base URL (https://api.openai.com/v1/) works for the native OpenAI service. Point the base URL at your preferred OpenAI-compatible endpoint (for example, a hosted proxy or LiteLLM deployment) when needed.
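As a minimal sketch, this is what pointing AI Gateway at a self-hosted OpenAI-compatible endpoint might look like; the key and URL values below are placeholders, not real credentials or hosts:

```shell
# Placeholder values: substitute your own provider key and endpoint.
export CODER_AIBRIDGE_OPENAI_KEY="<your-provider-key>"
export CODER_AIBRIDGE_OPENAI_BASE_URL="https://litellm.example.internal/v1/"
coder server
```

Leave CODER_AIBRIDGE_OPENAI_BASE_URL unset to use the native OpenAI service at the default base URL.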

If you'd like to create an OpenAI key with minimal privileges, this is the minimum required set of scopes:

  • List Models: Read
  • Model Capabilities: Request

Note

See the Supported APIs section below for precise endpoint coverage and interception behavior.

Data Retention

AI Gateway records prompts, token usage, tool invocations, and model reasoning for auditing and monitoring purposes. By default, this data is retained for 60 days.

Configure retention using --aibridge-retention or CODER_AIBRIDGE_RETENTION:

coder server --aibridge-retention=90d

Or in YAML:

aibridge:
  retention: 90d

Set to 0 to retain data indefinitely.
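Equivalently, a sketch of the same setting via the environment variable (the 90-day value simply mirrors the example above):

```shell
export CODER_AIBRIDGE_RETENTION=90d
coder server
```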

For duration formats, how retention works, and best practices, see the Data Retention documentation.

Structured Logging

AI Gateway can emit structured logs for every interception record, making it straightforward to export data to external SIEM or observability platforms.

Enable with --aibridge-structured-logging or CODER_AIBRIDGE_STRUCTURED_LOGGING:

coder server --aibridge-structured-logging=true

Or in YAML:

aibridge:
  structured_logging: true

These logs are written to the same output stream as all other coderd logs, using the format configured by --log-human (default, writes to stderr) or --log-json. For machine ingestion, set --log-json to a file path or /dev/stderr so that records are emitted as JSON.
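Putting the two flags together, a minimal invocation sketch for machine ingestion; the log file path is a placeholder:

```shell
# Emit all coderd logs, including interception records, as JSON to a file.
coder server --aibridge-structured-logging=true --log-json=/var/log/coder/coder.json
```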

Filter for AI Gateway records in your logging pipeline by matching on the "interception log" message. Each log line includes a record_type field that indicates the kind of event captured:

  • interception_start: A new intercepted request begins. Key fields: interception_id, initiator_id, provider, model, client, started_at.
  • interception_end: An intercepted request completes. Key fields: interception_id, ended_at.
  • token_usage: Token consumption for a response. Key fields: interception_id, input_tokens, output_tokens, created_at.
  • prompt_usage: The last user prompt in a request. Key fields: interception_id, prompt, created_at.
  • tool_usage: A tool/function call made by the model. Key fields: interception_id, tool, input, server_url, injected, created_at.
  • model_thought: Model reasoning or thinking content. Key fields: interception_id, content, created_at.
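As a sketch of the pipeline-side match, AI Gateway records can be pulled out of a JSON log stream with a fixed-string match on the "interception log" message. The sample log lines below are illustrative only; the exact field layout of your --log-json output may differ:

```shell
# Filter AI Gateway interception records out of a mixed JSON log stream.
printf '%s\n' \
  '{"msg":"interception log","record_type":"token_usage","input_tokens":12,"output_tokens":34}' \
  '{"msg":"workspace build","status":"ok"}' \
  | grep -F '"interception log"'
# prints only the token_usage line
```

Once matched, a JSON processor such as jq can select on the record_type field to route each event kind separately.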