Setup
For this minimal setup, you'll need just two files in your project directory: `config/tensorzero.toml` and `docker-compose.yml`.

Configuration
Create a minimal configuration file, `config/tensorzero.toml`, that defines a model and a simple chat function.
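The original configuration example is not reproduced here, but a minimal sketch might look like the following. The model name (`claude_3_5_sonnet`), function and variant names (`my_function_name`, `my_variant_name`), and the project/location/model ID values are placeholders you should replace with your own:

```toml
# Define a model backed by the GCP Vertex AI Anthropic provider
[models.claude_3_5_sonnet]
routing = ["gcp_vertex_anthropic"]

[models.claude_3_5_sonnet.providers.gcp_vertex_anthropic]
type = "gcp_vertex_anthropic"
model_id = "claude-3-5-sonnet@20240620"  # placeholder model ID
location = "us-central1"                 # placeholder region
project_id = "your-project-id"           # placeholder project

# Define a simple chat function that uses the model above
[functions.my_function_name]
type = "chat"

[functions.my_function_name.variants.my_variant_name]
type = "chat_completion"
model = "claude_3_5_sonnet"
```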
You can use the shorthand `gcp_vertex_anthropic::model_name` to use a GCP Vertex AI Anthropic model with TensorZero if you don't need advanced features like fallbacks or custom credentials:
- `gcp_vertex_anthropic::projects/<PROJECT_ID>/locations/<REGION>/publishers/google/models/<MODEL_ID>`
- `gcp_vertex_anthropic::projects/<PROJECT_ID>/locations/<REGION>/endpoints/<ENDPOINT_ID>`
Credentials
By default, TensorZero reads the path to your GCP service account JSON file from the `GCP_VERTEX_CREDENTIALS_PATH` environment variable (i.e. the default credential location is `path_from_env::GCP_VERTEX_CREDENTIALS_PATH`).
You must generate a GCP service account key in JSON format (see the GCP documentation on creating service account keys).
You can customize the credential location using:
- `sdk`: use the Google Cloud SDK to auto-discover credentials
- `path::/path/to/credentials.json`: use a specific file path
- `path_from_env::YOUR_ENVIRONMENT_VARIABLE`: read the file path from an environment variable (default behavior)
- `dynamic::ARGUMENT_NAME`: provide credentials dynamically at inference time
- `{ default = ..., fallback = ... }`: configure credential fallbacks
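As an illustration of the options above, a provider block could override the default like this. This is a sketch, assuming the provider accepts a `credential_location` field and reusing placeholder names; check the TensorZero configuration reference for the exact field name:

```toml
[models.claude_3_5_sonnet.providers.gcp_vertex_anthropic]
type = "gcp_vertex_anthropic"
model_id = "claude-3-5-sonnet@20240620"  # placeholder model ID
location = "us-central1"                 # placeholder region
project_id = "your-project-id"           # placeholder project
# Read credentials from a custom environment variable instead of the default
credential_location = "path_from_env::MY_GCP_CREDENTIALS_PATH"
```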
Deployment (Docker Compose)
Create a minimal Docker Compose configuration in `docker-compose.yml`.
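The original Compose file is not reproduced here, but a minimal sketch might look like the following. The image name (`tensorzero/gateway`), mount paths, and port are assumptions to adapt to your setup; it mounts the config directory and the service account key into the gateway container:

```yaml
services:
  gateway:
    image: tensorzero/gateway
    volumes:
      # Mount the configuration directory created above
      - ./config:/app/config:ro
      # Mount your GCP service account key into the container
      - ${GCP_VERTEX_CREDENTIALS_PATH:-/dev/null}:/app/gcp-credentials.json:ro
    environment:
      # Point the gateway at the mounted credentials file
      - GCP_VERTEX_CREDENTIALS_PATH=/app/gcp-credentials.json
    command: --config-file /app/config/tensorzero.toml
    ports:
      - "3000:3000"
```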
You can start the gateway with `docker compose up`.