Learn how to use TensorZero with OpenAI-compatible LLMs: open-source gateway, observability, optimization, evaluations, and experimentation.
This assumes that Ollama is running locally (`ollama serve`) and that you've pulled the `llama3.1` model in advance (e.g. `ollama pull llama3.1`).
Make sure to update the `api_base` and `model_name` in the configuration below to match your OpenAI-compatible endpoint and model.
For this minimal setup, you’ll need just two files in your project directory:
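One of those files is typically a Docker Compose file for the TensorZero gateway. A minimal sketch, in which the image name, mounted config path, and port are assumptions (check the TensorZero documentation for the exact values for your version):

```yaml
# docker-compose.yml (sketch)
services:
  gateway:
    image: tensorzero/gateway
    volumes:
      # Mount your configuration directory into the container (read-only).
      - ./config:/app/config:ro
    command: --config-file /app/config/tensorzero.toml
    ports:
      - "3000:3000"
```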
The `api_key_location` field in your model provider configuration specifies how to handle API key authentication. For endpoints that don't require authentication, such as a local Ollama server, set `api_key_location = "none"`.
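Putting these fields together, a model provider entry for a local Ollama server might look like the following sketch (the model and provider names are illustrative, and `host.docker.internal` assumes the gateway runs in Docker on the same machine as Ollama):

```toml
# config/tensorzero.toml (sketch)

[models.llama3_1]
routing = ["ollama"]

# Ollama exposes an OpenAI-compatible API, so we use the "openai" provider type.
[models.llama3_1.providers.ollama]
type = "openai"
model_name = "llama3.1"
# From inside Docker, host.docker.internal reaches the host machine,
# where `ollama serve` listens on port 11434.
api_base = "http://host.docker.internal:11434/v1/"
# Ollama doesn't require an API key, so we disable authentication.
api_key_location = "none"
```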
You can start TensorZero with `docker compose up`.
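Once the gateway is up, you can send it an inference request. Here is a minimal sketch using only the Python standard library; the gateway URL, endpoint path, and model name are assumptions based on the configuration above, so adjust them to match your setup:

```python
import json
import urllib.request

# Assumptions: the gateway listens on localhost:3000 and exposes an
# /inference endpoint; the model name matches your tensorzero.toml.
GATEWAY_URL = "http://localhost:3000/inference"

payload = {
    "model_name": "llama3.1",
    "input": {
        "messages": [
            {"role": "user", "content": "Write a haiku about observability."}
        ]
    },
}


def run_inference() -> dict:
    """POST the payload to the gateway and return the parsed JSON response.

    Requires the gateway (and Ollama) to be running.
    """
    request = urllib.request.Request(
        GATEWAY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())
```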