Simple Setup
You can use the short-hand anthropic::model_name to use an Anthropic model with TensorZero, unless you need advanced features like fallbacks or custom credentials.
You can use Anthropic models in your TensorZero variants by setting the model field to anthropic::model_name.
For example:
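A minimal sketch of such a variant (the function, variant, and model names below are placeholders):

```toml
# A chat_completion variant that points at an Anthropic model via the short-hand.
[functions.my_function.variants.my_variant]
type = "chat_completion"
model = "anthropic::claude-3-5-haiku-20241022"
```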
Alternatively, you can set model_name to anthropic::model_name in the inference request to use a specific Anthropic model, without having to configure a function and variant in TensorZero.
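For instance, a plain HTTP call to the gateway's inference endpoint might look like this sketch (the gateway URL, port, and model ID are assumptions based on the default setup):

```python
import requests

# Sketch: call POST /inference with model_name instead of a configured function.
# The gateway URL and the model ID are placeholders.
response = requests.post(
    "http://localhost:3000/inference",
    json={
        "model_name": "anthropic::claude-3-5-haiku-20241022",
        "input": {
            "messages": [{"role": "user", "content": "Write a haiku about TensorZero."}]
        },
    },
)
response.raise_for_status()
print(response.json())
```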
Advanced Setup
In more complex scenarios (e.g. fallbacks, custom credentials), you can configure your own model and Anthropic provider in TensorZero. For this minimal setup, you’ll need just two files in your project directory:
Configuration
Create a minimal configuration file that defines a model and a simple chat function:
config/tensorzero.toml
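As a sketch, such a file might look like the following (the model, provider, function, and variant names are placeholders; see the Configuration Reference for the exact schema):

```toml
# A model backed by the Anthropic provider.
[models.claude-3-5-haiku]
routing = ["anthropic"]

[models.claude-3-5-haiku.providers.anthropic]
type = "anthropic"
model_name = "claude-3-5-haiku-20241022"

# A simple chat function with one variant that uses the model above.
[functions.my_function]
type = "chat"

[functions.my_function.variants.my_variant]
type = "chat_completion"
model = "claude-3-5-haiku"
```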
Credentials
You must set the ANTHROPIC_API_KEY environment variable before running the gateway.
You can customize the credential location by setting the api_key_location to env::YOUR_ENVIRONMENT_VARIABLE or dynamic::ARGUMENT_NAME.
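For example, a provider block overriding the credential location might look like this sketch (the environment variable name is a placeholder):

```toml
[models.claude-3-5-haiku.providers.anthropic]
type = "anthropic"
model_name = "claude-3-5-haiku-20241022"
# Read the API key from a custom environment variable instead of ANTHROPIC_API_KEY.
api_key_location = "env::MY_ANTHROPIC_API_KEY"
```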
See the Credential Management guide and Configuration Reference for more information.
Deployment (Docker Compose)
Create a minimal Docker Compose configuration:
docker-compose.yml
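A sketch of such a file, assuming the published tensorzero/gateway image and the default port (adjust the image tag and paths to your setup):

```yaml
services:
  gateway:
    image: tensorzero/gateway
    volumes:
      # Mount the configuration directory created above.
      - ./config:/app/config
    command: --config-file /app/config/tensorzero.toml
    environment:
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY:?ANTHROPIC_API_KEY must be set}
    ports:
      - "3000:3000"
```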
You can start the gateway with docker compose up.
Inference
Make an inference request to the gateway:
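A sketch using plain HTTP against the function defined in the configuration above (the gateway URL and function name are assumptions):

```python
import requests

# Sketch: call the configured chat function through the gateway.
response = requests.post(
    "http://localhost:3000/inference",
    json={
        "function_name": "my_function",
        "input": {
            "messages": [{"role": "user", "content": "Tell me about prompt caching."}]
        },
    },
)
response.raise_for_status()
print(response.json())
```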
Enable Anthropic’s prompt caching capability
You can enable Anthropic’s prompt caching capability with TensorZero’s extra_body.
For example, to enable caching on your system prompt:
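As a rough sketch (the extra_body entry below assumes a variant_name matching the configuration above and a /system pointer into the provider request; consult the extra_body documentation for the exact pointer and addressing for your setup):

```python
import requests

# Sketch: inject Anthropic's cache_control into the system prompt via extra_body.
# The pointer path and value shape follow Anthropic's messages API convention for
# cached system blocks; verify them against the extra_body documentation.
response = requests.post(
    "http://localhost:3000/inference",
    json={
        "function_name": "my_function",
        "input": {
            "system": "You are a helpful assistant. <long, reusable instructions>",
            "messages": [{"role": "user", "content": "Hello!"}],
        },
        "extra_body": [
            {
                "variant_name": "my_variant",
                "pointer": "/system",
                "value": [
                    {
                        "type": "text",
                        "text": "You are a helpful assistant. <long, reusable instructions>",
                        "cache_control": {"type": "ephemeral"},
                    }
                ],
            }
        ],
    },
)
response.raise_for_status()
print(response.json())
```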
Use Anthropic models on third-party platforms
Use Anthropic models on AWS Bedrock
You can use Anthropic models on AWS Bedrock with the aws_bedrock model provider.
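For example, a provider block might look like this sketch (assuming the provider's model_id and region fields; the Bedrock model ID and region are placeholders):

```toml
[models.claude-3-5-haiku.providers.bedrock]
type = "aws_bedrock"
model_id = "anthropic.claude-3-5-haiku-20241022-v1:0"
region = "us-east-1"
```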
Use Anthropic models on Azure
You can use Anthropic models on Azure AI Foundry by overriding the API base in your configuration:
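A sketch of such an override (assuming the provider accepts an api_base field; the endpoint URL is a placeholder for your Azure AI Foundry deployment):

```toml
[models.claude-3-5-haiku.providers.azure_anthropic]
type = "anthropic"
model_name = "claude-3-5-haiku-20241022"
# Point the Anthropic-compatible client at the Azure AI Foundry endpoint.
api_base = "https://your-resource.services.ai.azure.com/anthropic/"
```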
Use Anthropic models on GCP Vertex AI
You can use Anthropic models on GCP Vertex AI with the gcp_vertex_anthropic model provider.
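For example (assuming the provider's model_id, location, and project_id fields; all values are placeholders):

```toml
[models.claude-3-5-haiku.providers.vertex]
type = "gcp_vertex_anthropic"
model_id = "claude-3-5-haiku@20241022"
location = "us-central1"
project_id = "your-gcp-project"
```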