# Setup
AI Bridge Proxy runs inside the Coder control plane (coderd), requiring no separate compute to deploy or scale.
Once enabled, coderd runs `aibridgeproxyd` in-memory and intercepts traffic to supported AI providers, forwarding it to AI Bridge.
Required:

1. AI Bridge must be enabled and configured (requires a Premium license with the AI Governance Add-On). See AI Bridge Setup for further information.
2. AI Bridge Proxy must be enabled using the server flag.
3. A CA certificate must be configured for MITM interception.
4. Clients must be configured to use the proxy and trust the CA certificate.
> **Warning**
> AI Bridge Proxy should only be accessible within a trusted network and must not be directly exposed to the public internet. See Security Considerations for details.
## Proxy Configuration
AI Bridge Proxy is disabled by default. To enable it, set the following configuration options:
```shell
CODER_AIBRIDGE_ENABLED=true \
CODER_AIBRIDGE_PROXY_ENABLED=true \
CODER_AIBRIDGE_PROXY_CERT_FILE=/path/to/ca.crt \
CODER_AIBRIDGE_PROXY_KEY_FILE=/path/to/ca.key \
coder server

# or via CLI flags:
coder server \
  --aibridge-enabled=true \
  --aibridge-proxy-enabled=true \
  --aibridge-proxy-cert-file=/path/to/ca.crt \
  --aibridge-proxy-key-file=/path/to/ca.key
```
Both the certificate and private key are required for AI Bridge Proxy to start. See CA Certificate for how to generate and obtain these files.
The AI Bridge Proxy only intercepts and forwards traffic to AI Bridge for the supported AI provider domains:

- Anthropic: `api.anthropic.com`
- OpenAI: `api.openai.com`
- GitHub Copilot: `api.individual.githubcopilot.com`
All other traffic is tunneled through without decryption.
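The routing decision amounts to a match on the hostname of each `CONNECT` request. The sketch below is purely illustrative (the `route_for` function is hypothetical, not part of Coder):

```shell
# Hypothetical sketch of the per-connection routing decision: the three
# supported provider domains are intercepted (MITM); everything else is
# tunneled through without decryption.
route_for() {
  case "$1" in
    api.anthropic.com|api.openai.com|api.individual.githubcopilot.com)
      echo "mitm" ;;    # decrypt and forward to AI Bridge
    *)
      echo "tunnel" ;;  # relay TLS bytes untouched
  esac
}

route_for api.openai.com        # prints "mitm"
route_for internal.example.com  # prints "tunnel"
```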
For additional configuration options, see the Coder server configuration.
## Security Considerations
> **Warning**
> AI Bridge Proxy uses HTTP for incoming connections from AI tools. If exposed to untrusted networks, Coder credentials may be intercepted and internal services may become accessible to attackers. The AI Bridge Proxy must not be exposed to the public internet.
AI Bridge Proxy is an HTTP proxy, which means:
- **Credentials are sent in plain text**: In order to authenticate with Coder via AI Bridge, AI tools send the Coder session token in the proxy credentials over HTTP. The proxy then relays these credentials to AI Bridge for authentication. If the proxy is exposed to an untrusted network, these credentials could be intercepted.
- **Open tunnel access**: Requests to non-allowlisted domains are tunneled through the proxy without restriction. An attacker with access to the proxy could use it to reach internal services or route traffic through the infrastructure.
> **Note**
> These limitations are known and additional security hardening is planned on the roadmap.
These risks apply only to the connection phase between AI tools and the proxy. Once connected:
- **MITM mode**: A TLS connection is established between the AI tool and the proxy (using the configured CA certificate), then traffic is forwarded securely to AI Bridge.
- **Tunnel mode**: A TLS connection is established directly between the AI tool and the destination, passing through the proxy without decryption.
### Deployment options
To address these risks, the recommended deployment options are:
- **Internal network only**: Deploy the proxy so that only AI tools within the same internal network can access it. This is the simplest and safest approach when AI tools run inside Coder workspaces on the same network as the Coder deployment.
- **TLS-terminating load balancer**: Place a TLS-terminating load balancer in front of the proxy so AI tools connect to the load balancer over HTTPS. The load balancer terminates TLS and forwards requests to the proxy over HTTP. This protects credentials in transit, but you must still restrict access to allowed source IPs to prevent unauthorized use.
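As a rough illustration of the TLS-terminating pattern, the one-liner below uses `socat` to accept HTTPS proxy connections and forward them as plaintext; this is a prototyping sketch only, not a production load balancer, and `lb.pem`, `lb.key`, and `proxy-host` are placeholders:

```shell
# Illustration only: terminate TLS on :8443 using the load balancer's own
# server certificate, then forward plaintext proxy traffic to AI Bridge
# Proxy on port 8888. Source-IP restrictions must be applied separately
# (e.g. via firewall rules).
socat "OPENSSL-LISTEN:8443,reuseaddr,fork,cert=lb.pem,key=lb.key,verify=0" \
      "TCP:proxy-host:8888"
```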
## CA Certificate
AI Bridge Proxy uses a CA (Certificate Authority) certificate to perform MITM interception of HTTPS traffic. When AI tools connect to AI provider domains through the proxy, the proxy presents a certificate signed by this CA. AI tools must trust this CA certificate; otherwise, the connection will fail.
### Self-signed certificate
Use a self-signed certificate when your organization doesn't have an internal CA, or when you want a dedicated CA specifically for AI Bridge Proxy.
Generate a CA certificate specifically for AI Bridge Proxy:
1. Generate a private key:

   ```shell
   openssl genrsa -out ca.key 4096
   chmod 400 ca.key
   ```

2. Create a self-signed CA certificate (valid for 10 years):

   ```shell
   openssl req -new -x509 -days 3650 \
     -key ca.key \
     -out ca.crt \
     -subj "/CN=AI Bridge Proxy CA"
   ```
Configure AI Bridge Proxy with both files:
```shell
CODER_AIBRIDGE_PROXY_CERT_FILE=/path/to/ca.crt
CODER_AIBRIDGE_PROXY_KEY_FILE=/path/to/ca.key
```
### Corporate CA certificate
If your organization has an internal CA that clients already trust, you can have it issue an intermediate CA certificate for AI Bridge Proxy. This simplifies deployment since AI tools that already trust your organization's root CA will automatically trust certificates signed by the intermediate.
Your organization's CA issues a certificate and private key pair for the proxy. Configure the proxy with both files:
```shell
CODER_AIBRIDGE_PROXY_CERT_FILE=/path/to/intermediate-ca.crt
CODER_AIBRIDGE_PROXY_KEY_FILE=/path/to/intermediate-ca.key
```
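Before deploying, you can confirm the intermediate actually chains to the root your clients trust. The standalone sketch below fabricates a stand-in root and intermediate (subjects and file names are made up) just to demonstrate the check; with real files, only the final `openssl verify` line applies:

```shell
tmp=$(mktemp -d) && cd "$tmp"

# Stand-in root CA (in practice, your organization's existing root)
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -keyout root.key -out root.crt -subj "/CN=Example Root CA"

# Intermediate CA key and CSR, then sign the CSR with the root as a CA cert
openssl req -newkey rsa:2048 -nodes \
  -keyout intermediate-ca.key -out intermediate.csr \
  -subj "/CN=Example Intermediate CA"
openssl x509 -req -in intermediate.csr -days 365 \
  -CA root.crt -CAkey root.key -CAcreateserial \
  -extfile <(printf "basicConstraints=critical,CA:TRUE") \
  -out intermediate-ca.crt

# The check that matters: the intermediate must verify against the root
openssl verify -CAfile root.crt intermediate-ca.crt
```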
### Securing the private key
> **Warning**
> The CA private key is used to sign certificates for MITM interception. Store it securely and restrict access. If compromised, an attacker could intercept traffic from any client that trusts the CA certificate.
Best practices:
- Restrict file permissions so only the Coder process can read the key.
- Use a secrets manager to store the key where possible.
### Distributing the certificate
AI tools need to trust the CA certificate before connecting through the proxy.
For self-signed certificates, AI tools must be configured to trust the CA certificate. The certificate (without the private key) is available at:
```text
https://<coder-url>/api/v2/aibridge/proxy/ca-cert.pem
```
For corporate CA certificates, if the systems where AI tools run already trust your organization's root CA, and the intermediate certificate chains correctly to that root, no additional certificate distribution is needed. Otherwise, AI tools must be configured to trust the intermediate CA certificate from the endpoint above.
How you configure AI tools to trust the certificate depends on the tool and operating system. See Client Configuration for details.
## Upstream proxy
If your organization requires all outbound traffic to pass through a corporate proxy, you can configure AI Bridge Proxy to chain requests to an upstream proxy.
> **Note**
> AI Bridge Proxy must be the first proxy in the chain. AI tools must be configured to connect directly to AI Bridge Proxy, which then forwards tunneled traffic to the upstream proxy.
### How it works
- **Tunneled requests** (non-allowlisted domains) are forwarded to the upstream proxy configured via `CODER_AIBRIDGE_PROXY_UPSTREAM`.
- **MITM'd requests** (AI provider domains) are forwarded to AI Bridge, which then communicates with AI providers. To ensure AI Bridge also routes requests through the upstream proxy, make sure to configure the proxy settings for the Coder server process.
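One common way to do that, assuming the Coder server honors the standard proxy environment variables for its outbound requests (typical for Go services; verify in your deployment), is to set them on the server process:

```shell
# Assumption: the Coder server picks up the standard proxy environment
# variables for its own outbound HTTP requests.
# NO_PROXY keeps local/internal traffic off the corporate proxy.
HTTPS_PROXY=http://<corporate-proxy-url>:8080 \
NO_PROXY=localhost,127.0.0.1 \
coder server
```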
### Configuration
Configure the upstream proxy URL:
```shell
CODER_AIBRIDGE_PROXY_UPSTREAM=http://<corporate-proxy-url>:8080
```
For HTTPS upstream proxies, if the upstream proxy uses a certificate not trusted by the system, provide the CA certificate:
```shell
CODER_AIBRIDGE_PROXY_UPSTREAM=https://<corporate-proxy-url>:8080
CODER_AIBRIDGE_PROXY_UPSTREAM_CA=/path/to/corporate-ca.crt
```
If the system already trusts the upstream proxy's CA certificate, `CODER_AIBRIDGE_PROXY_UPSTREAM_CA` is not required.
## Client Configuration
To use AI Bridge Proxy, AI tools must be configured to:
- Route traffic through the proxy
- Trust the proxy's CA certificate
### Configuring the proxy
The preferred approach is to configure the proxy directly in the AI tool's settings, as this avoids routing unnecessary traffic through the proxy. Consult the tool's documentation for specific instructions.
Alternatively, most tools support the standard proxy environment variables, though this is not guaranteed for all tools:
```shell
export HTTP_PROXY="http://coder:${CODER_SESSION_TOKEN}@<proxy-host>:8888"
export HTTPS_PROXY="http://coder:${CODER_SESSION_TOKEN}@<proxy-host>:8888"
```
- `HTTP_PROXY`: Used for requests to `http://` URLs
- `HTTPS_PROXY`: Used for requests to `https://` URLs (this is the one used for AI provider domains)
In order for AI tools that communicate with AI Bridge Proxy to authenticate with Coder via AI Bridge, the Coder session token needs to be passed in the proxy credentials as the password field.
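A quick way to check the end-to-end path is a single curl request through the proxy. The hostnames and file paths below are placeholders, and `/v1/models` is just an example request path:

```shell
# Send one request through AI Bridge Proxy: --proxy carries the Coder
# session token as the proxy password; --cacert trusts the proxy's CA
# certificate for the MITM'd connection.
curl -v \
  --proxy "http://coder:${CODER_SESSION_TOKEN}@<proxy-host>:8888" \
  --cacert /path/to/coder-aibridge-proxy-ca.pem \
  https://api.anthropic.com/v1/models
```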
### Trusting the CA certificate
The preferred approach is to configure the CA certificate directly in the AI tool's settings, as this limits the scope of the trusted certificate to that specific application. Consult the tool's documentation for specific instructions.
> **Note**
> If using a corporate CA certificate and the system already trusts your organization's root CA, no additional certificate configuration is required.
Download the certificate:
```shell
curl -o coder-aibridge-proxy-ca.pem \
  -H "Coder-Session-Token: ${CODER_SESSION_TOKEN}" \
  https://<coder-url>/api/v2/aibridge/proxy/ca-cert.pem
```

Replace `<coder-url>` with your Coder deployment URL.
#### Environment variables
Different AI tools use different runtimes, each with their own environment variable for CA certificates:
| Environment Variable | Runtime |
|---|---|
| `NODE_EXTRA_CA_CERTS` | Node.js |
| `SSL_CERT_FILE` | OpenSSL, Python, curl |
| `REQUESTS_CA_BUNDLE` | Python requests library |
| `CURL_CA_BUNDLE` | curl |
Set the environment variables associated with the AI tool's runtime. If you're unsure which runtime the tool uses, or if you use multiple AI tools, the simplest approach is to set all of them:
```shell
export NODE_EXTRA_CA_CERTS="/path/to/coder-aibridge-proxy-ca.pem"
export SSL_CERT_FILE="/path/to/coder-aibridge-proxy-ca.pem"
export REQUESTS_CA_BUNDLE="/path/to/coder-aibridge-proxy-ca.pem"
export CURL_CA_BUNDLE="/path/to/coder-aibridge-proxy-ca.pem"
```
#### System trust store
When tool-specific or environment variable configuration is not possible, you can add the certificate to the system trust store. This makes the certificate trusted by all applications on the system.
On Debian-based Linux distributions:

```shell
# update-ca-certificates only picks up files with a .crt extension
sudo cp coder-aibridge-proxy-ca.pem /usr/local/share/ca-certificates/coder-aibridge-proxy-ca.crt
sudo update-ca-certificates
```
For other operating systems, refer to the system's documentation for instructions on adding trusted certificates.
### Coder workspaces
For AI tools running inside Coder workspaces, template administrators can pre-configure the proxy settings and CA certificate in the workspace template. This provides a seamless experience where users don't need to configure anything manually.
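As an illustration, a template's startup script might fetch the certificate and export the proxy settings for every login shell. Everything here is an assumption to adapt: the file paths, the `profile.d` approach, and the `<coder-url>`/`<proxy-host>` placeholders:

```shell
# Hypothetical workspace startup-script fragment: fetch the proxy CA and
# pre-configure proxy and trust settings for all login shells.
curl -fsSL -H "Coder-Session-Token: ${CODER_SESSION_TOKEN}" \
  -o /usr/local/share/coder-aibridge-proxy-ca.pem \
  "https://<coder-url>/api/v2/aibridge/proxy/ca-cert.pem"

cat > /etc/profile.d/aibridge-proxy.sh <<'EOF'
export HTTPS_PROXY="http://coder:${CODER_SESSION_TOKEN}@<proxy-host>:8888"
export NODE_EXTRA_CA_CERTS="/usr/local/share/coder-aibridge-proxy-ca.pem"
export SSL_CERT_FILE="/usr/local/share/coder-aibridge-proxy-ca.pem"
EOF
```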
For tool-specific configuration details, check the client compatibility table for clients that require proxy-based integration.


