This guide shows how to set up a minimal deployment to use the TensorZero Gateway with the AWS Bedrock API.

Setup

For this minimal setup, you’ll need just two files in your project directory:
- config/
  - tensorzero.toml
- docker-compose.yml
You can also find the complete code for this example on GitHub.
For production deployments, see our Deployment Guide.
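For example, starting from an empty project directory, you can scaffold these files with:

mkdir -p config
touch config/tensorzero.toml docker-compose.yml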

Configuration

Create a minimal configuration file that defines a model and a simple chat function:
config/tensorzero.toml
[models.claude_3_haiku_20240307]
routing = ["aws_bedrock"]

[models.claude_3_haiku_20240307.providers.aws_bedrock]
type = "aws_bedrock"
model_id = "anthropic.claude-3-haiku-20240307-v1:0"

[functions.my_function_name]
type = "chat"

[functions.my_function_name.variants.my_variant_name]
type = "chat_completion"
model = "claude_3_haiku_20240307"
See the list of available models on AWS Bedrock.
Many AWS Bedrock models are only available through cross-region inference profiles. For those models, the model_id requires a special prefix (e.g. the us. prefix in us.anthropic.claude-3-7-sonnet-20250219-v1:0). See the AWS documentation on inference profiles.
See the Configuration Reference for optional fields (e.g. overriding the region).
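For illustration, here is a sketch of overriding the region on the provider; the exact field names and accepted values are listed in the Configuration Reference, and the region value below is only an example:

[models.claude_3_haiku_20240307.providers.aws_bedrock]
type = "aws_bedrock"
model_id = "anthropic.claude-3-haiku-20240307-v1:0"
region = "us-west-2"  # example override; otherwise the AWS SDK resolves the region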

Credentials

Make sure the gateway has the necessary permissions to access AWS Bedrock. The TensorZero Gateway uses the AWS SDK to retrieve the relevant credentials. The simplest approach is to set the following environment variables before running the gateway:
AWS_ACCESS_KEY_ID=...
AWS_REGION=us-east-1
AWS_SECRET_ACCESS_KEY=...
Alternatively, you can use other authentication methods supported by the AWS SDK.
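For example, in a shell session you might export the variables before starting the gateway (substitute your own IAM credentials for the placeholders):

export AWS_ACCESS_KEY_ID="..."        # your IAM access key ID
export AWS_REGION="us-east-1"         # the region your Bedrock models live in
export AWS_SECRET_ACCESS_KEY="..."    # your IAM secret access key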

Deployment (Docker Compose)

Create a minimal Docker Compose configuration:
docker-compose.yml
# This is a simplified example for learning purposes. Do not use this in production.
# For production-ready deployments, see: https://www.tensorzero.com/docs/gateway/deployment

services:
  gateway:
    image: tensorzero/gateway
    volumes:
      - ./config:/app/config:ro
    command: --config-file /app/config/tensorzero.toml
    environment:
      - AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID:?Environment variable AWS_ACCESS_KEY_ID must be set.}
      - AWS_REGION=${AWS_REGION:?Environment variable AWS_REGION must be set.}
      - AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY:?Environment variable AWS_SECRET_ACCESS_KEY must be set.}
    ports:
      - "3000:3000"
    extra_hosts:
      - "host.docker.internal:host-gateway"
You can start the gateway with docker compose up.
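Once the container is up, you can sanity-check that the gateway is listening; this assumes the gateway's default health endpoint at /health:

curl http://localhost:3000/health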

Inference

Make an inference request to the gateway:
curl -X POST http://localhost:3000/inference \
  -H "Content-Type: application/json" \
  -d '{
    "function_name": "my_function_name",
    "input": {
      "messages": [
        {
          "role": "user",
          "content": "What is the capital of Japan?"
        }
      ]
    }
  }'
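The gateway responds with JSON along these lines (abridged and illustrative; the exact fields, IDs, and token counts will differ):

{
  "inference_id": "00000000-0000-0000-0000-000000000000",
  "episode_id": "00000000-0000-0000-0000-000000000000",
  "variant_name": "my_variant_name",
  "content": [
    {
      "type": "text",
      "text": "The capital of Japan is Tokyo."
    }
  ],
  "usage": {
    "input_tokens": 10,
    "output_tokens": 10
  }
}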