AI Firewall via Docker

Prerequisites

Before installing the Levo AI Firewall via Docker, ensure you have:

  • Met the system requirements listed in Install AI Firewall
  • Docker Engine 20.10.0 or higher
  • Admin privileges on the Docker host
  • Your Authorization Key and Organisation ID from your Levo account

1. Pull the Image

docker pull levoai/ai-firewall:latest

2. Create a Configuration File

The AI Firewall requires a vigil.yaml config file that specifies the upstream LLM endpoint it should proxy to. Create one on your host:

# vigil.yaml
server:
  listen_addr: "0.0.0.0:8080"
  workers: 4
  timeout_secs: 30

upstream:
  # Address of the upstream LLM provider (host:port)
  address: "api.openai.com:443"
  tls: true
  sni: "api.openai.com"

detection:
  stages:
    fast_filters: true
    classifier: false
    semantic: false
    policy: true
  thresholds:
    block: 0.85
    flag: 0.5
  patterns_dir: "./data/patterns"
  models_dir: "./data/models"
  max_prompt_length: 100000

observability:
  log_level: "info"
  json_logs: false

alerting:
  enabled: true
  log_alerts: true
  min_severity: "low"

Replace api.openai.com with your upstream LLM provider address. For example, for Azure OpenAI set address: "your-resource.openai.azure.com:443" and sni: "your-resource.openai.azure.com".
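Indentation mistakes in vigil.yaml are the most common cause of startup failures. A minimal structural sanity check is sketched below; the required key names mirror the example config above, but the helper itself is illustrative and not part of the official image.

```python
# Sketch: sanity-check the structure of a parsed vigil.yaml before mounting it.
# Required keys mirror the example config in this guide (illustrative only).
REQUIRED = {
    "server": ["listen_addr", "workers", "timeout_secs"],
    "upstream": ["address", "tls", "sni"],
    "detection": ["stages", "thresholds"],
}

def missing_keys(config: dict) -> list[str]:
    """Return dotted paths for any required keys absent from the config."""
    problems = []
    for section, keys in REQUIRED.items():
        block = config.get(section)
        if not isinstance(block, dict):
            problems.append(section)
            continue
        problems.extend(f"{section}.{k}" for k in keys if k not in block)
    return problems

# A dict equivalent of the example vigil.yaml passes the check:
example = {
    "server": {"listen_addr": "0.0.0.0:8080", "workers": 4, "timeout_secs": 30},
    "upstream": {"address": "api.openai.com:443", "tls": True, "sni": "api.openai.com"},
    "detection": {"stages": {"fast_filters": True}, "thresholds": {"block": 0.85}},
}
print(missing_keys(example))  # -> []
```

In practice you would load the mounted file with a YAML parser first and pass the resulting dict to `missing_keys` before starting the container.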

3. Run the AI Firewall

docker run -d \
  --name levoai-aifirewall \
  --restart unless-stopped \
  -p 8080:8080 \
  -e LEVOAI_BASE_URL="https://api.levo.ai" \
  -e LEVOAI_AUTH_KEY="<Authorization Key>" \
  -e LEVOAI_ORG_ID="<Organisation ID>" \
  -e LEVOAI_ENV_NAME="<Environment Name>" \
  -e LEVOAI_SATELLITE_URL="<Satellite URL>" \
  -v /path/to/vigil.yaml:/app/config/vigil.yaml:ro \
  levoai/ai-firewall:latest

Replace /path/to/vigil.yaml with the absolute path to the config file you created in step 2. For LEVOAI_SATELLITE_URL, use https://satellite.levo.ai (Levo-hosted) or your own on-premise satellite address. For accounts on the India domain, set LEVOAI_BASE_URL to https://api.india-1.levo.ai.

4. Verify the Installation

Check that the container is running:

docker ps -f name=levoai-aifirewall

If the AI Firewall is healthy, you should see output similar to the following:

CONTAINER ID   IMAGE                       COMMAND         CREATED        STATUS        PORTS                    NAMES
b7e2d4c1a9f3   levoai/ai-firewall:latest   "vigil --..."   1 minute ago   Up 1 minute   0.0.0.0:8080->8080/tcp   levoai-aifirewall

Verify the health endpoint:

curl http://localhost:8080/health

Expected response:

{"status":"healthy","service":"vigil"}

Please contact support@levo.ai if you notice health or connectivity errors.

5. Configure the AI Firewall in the Levo Dashboard

The AI Firewall polls the Levo platform every 60 seconds and automatically applies the latest configuration. Routing rules and guardrail policies are managed from the dashboard.

  • Log in to Levo.ai.
  • Navigate to AI Firewalls → Configuration.
  • Paste your configuration YAML and click Save.

Refer to step 4 in AI Firewall on Kubernetes for a configuration example and YAML reference — the format is identical for Docker and Kubernetes deployments.

6. Point Your Application at the Firewall

Update your application to send LLM requests through the firewall instead of directly to the LLM provider. Replace the LLM provider's base URL with http://<Docker-host-IP>:8080/v1.

For example, if you are using the OpenAI SDK:

from openai import OpenAI

client = OpenAI(
    base_url="http://<Docker-host-IP>:8080/v1",
    api_key="<Your OpenAI API Key>",
)
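The same redirection works without the SDK: any OpenAI-compatible client only needs its request URLs rebased onto the firewall. The sketch below shows that rebasing; the `/chat/completions` path is the standard OpenAI-compatible chat endpoint, and the localhost base URL is an assumption for a firewall running on the local Docker host.

```python
import json

FIREWALL_BASE = "http://localhost:8080/v1"  # assumption: firewall on the local Docker host

def chat_request(model: str, messages: list[dict], base_url: str = FIREWALL_BASE):
    """Build the URL and JSON body for a chat completion routed via the firewall."""
    url = f"{base_url.rstrip('/')}/chat/completions"
    body = json.dumps({"model": model, "messages": messages})
    return url, body

url, body = chat_request("gpt-4o-mini", [{"role": "user", "content": "hello"}])
print(url)  # -> http://localhost:8080/v1/chat/completions
```

Requests built this way hit the firewall first, which inspects them and forwards clean traffic to the upstream configured in vigil.yaml.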

AI Firewall Lifecycle Management

Upgrade AI Firewall

Pull the latest image:

docker pull levoai/ai-firewall:latest

Stop and remove the existing container, then re-run the docker run command from Step 3:

docker stop levoai-aifirewall && docker rm levoai-aifirewall

Stop AI Firewall

docker stop levoai-aifirewall

Uninstall AI Firewall

docker stop levoai-aifirewall && docker rm levoai-aifirewall

Troubleshooting

Container Fails to Start

Check the container logs for errors:

docker logs levoai-aifirewall

Common causes:

  • Missing or invalid config: Ensure the config file path is correct and the upstream.address is reachable from the Docker host.
  • Invalid credentials: Verify your LEVOAI_AUTH_KEY and LEVOAI_ORG_ID are correct.
  • Network connectivity: Confirm the host can reach your upstream LLM provider and api.levo.ai on port 443.
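The reachability checks above can be automated with a plain TCP connect. A sketch using only the standard library (the hosts and port are the ones this guide uses; substitute your own upstream provider):

```python
import socket

def reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Check the endpoints the firewall needs: the upstream LLM and the Levo platform.
for host, port in [("api.openai.com", 443), ("api.levo.ai", 443)]:
    print(f"{host}:{port}", "ok" if reachable(host, port) else "UNREACHABLE")
```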

Enable Debug Logging

Stop and re-run the container with RUST_LOG=debug:

docker stop levoai-aifirewall && docker rm levoai-aifirewall

docker run -d \
  --name levoai-aifirewall \
  --restart unless-stopped \
  -p 8080:8080 \
  -e LEVOAI_BASE_URL="https://api.levo.ai" \
  -e LEVOAI_AUTH_KEY="<Authorization Key>" \
  -e LEVOAI_ORG_ID="<Organisation ID>" \
  -e LEVOAI_ENV_NAME="<Environment Name>" \
  -e LEVOAI_SATELLITE_URL="<Satellite URL>" \
  -e RUST_LOG="debug" \
  -v /path/to/vigil.yaml:/app/config/vigil.yaml:ro \
  levoai/ai-firewall:latest

Need Help?

For further assistance, please reach out to support@levo.ai.
