Learn how to use TensorZero with Mistral LLMs: open-source gateway, observability, optimization, evaluations, and experimentation.
You can use the shorthand `mistral::model_name` to use a Mistral model with TensorZero, unless you need advanced features like fallbacks or custom credentials.
You can use Mistral models in your TensorZero variants by setting the `model` field to `mistral::model_name`.
For example:
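The following is a minimal configuration sketch; the function, variant, and model names (`my_function`, `my_variant`, `open-mistral-nemo`) are illustrative assumptions, not values from this guide:

```toml
# tensorzero.toml — minimal sketch; names below are illustrative assumptions.
[functions.my_function]
type = "chat"

[functions.my_function.variants.my_variant]
type = "chat_completion"
model = "mistral::open-mistral-nemo"
```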
Alternatively, you can set `model_name` in the inference request to use a specific Mistral model, without having to configure a function and variant in TensorZero.
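As a sketch of this approach, the snippet below builds the JSON body for a gateway inference request that targets a Mistral model directly via `model_name`; the model name and message content are illustrative assumptions:

```python
import json

# Hypothetical example: JSON body for a TensorZero gateway inference request
# that selects a Mistral model via `model_name`, with no function/variant
# configured. The model name "open-mistral-nemo" is an assumption.
payload = {
    "model_name": "mistral::open-mistral-nemo",
    "input": {
        "messages": [
            {"role": "user", "content": "What is the capital of Japan?"}
        ]
    },
}

# Serialize the body that would be POSTed to the gateway's inference endpoint.
body = json.dumps(payload)
print(body)
```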
Make sure to set the `MISTRAL_API_KEY` environment variable before running the gateway.
You can customize the credential location by setting the `api_key_location` to `env::YOUR_ENVIRONMENT_VARIABLE` or `dynamic::ARGUMENT_NAME`.
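For instance, a model provider entry can be pointed at a custom environment variable; the model and variable names below are illustrative assumptions:

```toml
# Sketch: read the Mistral API key from a custom environment variable.
# Model name and variable name are assumptions for illustration.
[models.my_mistral_model]
routing = ["mistral"]

[models.my_mistral_model.providers.mistral]
type = "mistral"
model_name = "open-mistral-nemo"
api_key_location = "env::MY_MISTRAL_KEY"
```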
See the Credential Management guide and Configuration Reference for more information.
You can then start the gateway with `docker compose up`.