Learn how to use TensorZero with self-hosted Hugging Face TGI LLMs: open-source gateway, observability, optimization, evaluations, experimentation.
You'll need to update the `api_base` in the configuration below to match your TGI server.
For this minimal setup, you'll need just two files in your project directory:

- `config/tensorzero.toml`
- `docker-compose.yml`
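As a starting point, here is a minimal sketch of each file. The model name (`my-tgi-model`), function name (`my_function`), and `api_base` below are illustrative placeholders, not fixed values; swap in whatever matches your deployment.

```toml
# config/tensorzero.toml

[models.my-tgi-model]
routing = ["tgi"]

[models.my-tgi-model.providers.tgi]
type = "tgi"
# Point this at your TGI server's OpenAI-compatible endpoint
api_base = "http://host.docker.internal:8080/v1/"
# TGI doesn't require an API key by default (see below)
api_key_location = "none"

[functions.my_function]
type = "chat"

[functions.my_function.variants.tgi_variant]
type = "chat_completion"
model = "my-tgi-model"
```

And a sketch of the Docker Compose file, assuming your TGI server runs separately and that you haven't set up ClickHouse for observability yet:

```yaml
# docker-compose.yml

services:
  gateway:
    image: tensorzero/gateway
    volumes:
      # Mount the config directory so the gateway can read tensorzero.toml
      - ./config:/app/config:ro
    command: --config-file /app/config/tensorzero.toml
    # To enable observability, set TENSORZERO_CLICKHOUSE_URL here
    ports:
      - "3000:3000"
    # Lets the container reach a TGI server running on the host
    # via host.docker.internal (needed on Linux)
    extra_hosts:
      - "host.docker.internal:host-gateway"
```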
The `api_key_location` field in your model provider configuration specifies how to handle API key authentication. Since TGI doesn't require an API key by default, set `api_key_location = "none"`.
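If your TGI deployment does sit behind an authentication layer, you can point the field at a credential instead of disabling it. For example, here's a sketch assuming the key is stored in a (hypothetical) `TGI_API_KEY` environment variable:

```toml
[models.my-tgi-model.providers.tgi]
type = "tgi"
api_base = "http://host.docker.internal:8080/v1/"
# Read the API key from the TGI_API_KEY environment variable
api_key_location = "env::TGI_API_KEY"
```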
Once both files are in place, you can launch the TensorZero Gateway with `docker compose up`.
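Once the gateway is up, you can sanity-check the setup with an inference request. This sketch assumes the illustrative `my_function` defined in the configuration above:

```bash
curl -X POST http://localhost:3000/inference \
  -H "Content-Type: application/json" \
  -d '{
    "function_name": "my_function",
    "input": {
      "messages": [
        {"role": "user", "content": "What is the capital of Japan?"}
      ]
    }
  }'
```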