Setup
For this minimal setup, you'll need just two files in your project directory: `config/tensorzero.toml` and `docker-compose.yml`. You can also find the complete code for this example on GitHub.
Configuration
Create a minimal configuration file, `config/tensorzero.toml`, that defines a model and a simple chat function:
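A minimal sketch of such a configuration is shown below. The model name, deployment ID, and function and variant names are illustrative placeholders; only the `endpoint` setting follows the note beneath. Check the Configuration Reference for the authoritative schema.

```toml
# config/tensorzero.toml

[models.gpt_4o_mini_azure]
routing = ["azure"]

[models.gpt_4o_mini_azure.providers.azure]
type = "azure"
# Illustrative Azure OpenAI deployment ID; use your own deployment's name
deployment_id = "gpt-4o-mini"
# Read the Azure OpenAI endpoint from an environment variable on startup
endpoint = "env::AZURE_OPENAI_ENDPOINT"

[functions.my_function_name]
type = "chat"

[functions.my_function_name.variants.my_variant_name]
type = "chat_completion"
model = "gpt_4o_mini_azure"
```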
endpoint = "env::AZURE_OPENAI_ENDPOINT"
to read from the environment variable AZURE_OPENAI_ENDPOINT
on startup or endpoint = "dynamic::azure_openai_endpoint"
to read from a dynamic credential azure_openai_endpoint
on each inference.
Credentials
You must set the `AZURE_OPENAI_API_KEY` environment variable before running the gateway. You can customize the credential location by setting `api_key_location` to `env::YOUR_ENVIRONMENT_VARIABLE` or `dynamic::ARGUMENT_NAME`.
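For example, a sketch of a provider block that reads the key from a custom environment variable (the model name, deployment ID, and variable name are illustrative):

```toml
[models.gpt_4o_mini_azure.providers.azure]
type = "azure"
deployment_id = "gpt-4o-mini"
endpoint = "env::AZURE_OPENAI_ENDPOINT"
# Read the API key from a custom environment variable instead of the default
api_key_location = "env::MY_AZURE_OPENAI_API_KEY"
```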
See the Credential Management guide and Configuration Reference for more information.
Deployment (Docker Compose)
Create a minimal Docker Compose configuration, `docker-compose.yml`:
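A minimal sketch, assuming the `tensorzero/gateway` image, a read-only mount of the `config` directory, and the gateway's default port 3000 (verify these details against the current TensorZero documentation):

```yaml
# docker-compose.yml

services:
  gateway:
    image: tensorzero/gateway
    volumes:
      # Mount the directory containing tensorzero.toml read-only
      - ./config:/app/config:ro
    command: --config-file /app/config/tensorzero.toml
    environment:
      # Fail fast if the required Azure OpenAI credentials are missing
      - AZURE_OPENAI_API_KEY=${AZURE_OPENAI_API_KEY:?AZURE_OPENAI_API_KEY must be set}
      - AZURE_OPENAI_ENDPOINT=${AZURE_OPENAI_ENDPOINT:?AZURE_OPENAI_ENDPOINT must be set}
    ports:
      - "3000:3000"
```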
You can start the gateway with `docker compose up`.
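Once the gateway is running, you can sanity-check it with an inference request. A sketch, assuming the gateway listens on `localhost:3000` and uses the function name from the configuration above:

```bash
# Send a test inference request to the gateway
curl -X POST http://localhost:3000/inference \
  -H "Content-Type: application/json" \
  -d '{
    "function_name": "my_function_name",
    "input": {
      "messages": [
        {"role": "user", "content": "What is the capital of Japan?"}
      ]
    }
  }'
```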