Learn how to use TensorZero with DeepSeek LLMs: open-source gateway, observability, optimization, evaluations, and experimentation.
You can use the shorthand `deepseek::model_name` to use a DeepSeek model with TensorZero, unless you need advanced features like fallbacks or custom credentials.
You can use DeepSeek models in your TensorZero variants by setting the `model` field to `deepseek::model_name`.
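For example, a variant configuration might look like the following sketch (the function and variant names are illustrative placeholders):

```toml
# tensorzero.toml — minimal sketch; "my_function" and "my_variant" are placeholders
[functions.my_function]
type = "chat"

[functions.my_function.variants.my_variant]
type = "chat_completion"
model = "deepseek::deepseek-chat"
```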
Alternatively, you can set the `model_name` field to a `deepseek::model_name` value in the inference request to use a specific DeepSeek model, without having to configure a function and variant in TensorZero.
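As a sketch, the request below targets a TensorZero gateway assumed to be running locally on port 3000 and posts to its `/inference` endpoint; the prompt text is just an example:

```python
import json
import urllib.request

# Inference request payload using the deepseek:: shorthand as the model name,
# so no function/variant configuration is needed for this call.
payload = {
    "model_name": "deepseek::deepseek-chat",  # provider::model shorthand
    "input": {
        "messages": [
            {"role": "user", "content": "What is the capital of Japan?"},
        ]
    },
}

# Build the HTTP request against a locally running gateway (assumed URL).
request = urllib.request.Request(
    "http://localhost:3000/inference",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment once the gateway is running:
# with urllib.request.urlopen(request) as response:
#     print(json.load(response))
```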
The DeepSeek API offers two models: `deepseek-chat` (DeepSeek-V3) and `deepseek-reasoner` (DeepSeek-R1).
DeepSeek only supports JSON mode for `deepseek-chat`, and neither model supports tool use yet.
We include `thought` content blocks in the response and data model for reasoning models like `deepseek-reasoner`.
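As an illustrative sketch (not the exact response schema), a `deepseek-reasoner` response might contain content blocks along these lines, with the `thought` block carrying the model's reasoning alongside the final `text` block:

```json
{
  "content": [
    { "type": "thought", "text": "The user is asking a geography question..." },
    { "type": "text", "text": "The capital of Japan is Tokyo." }
  ]
}
```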
Make sure to set the `DEEPSEEK_API_KEY` environment variable before running the gateway.
You can customize the credential location by setting `api_key_location` to `env::YOUR_ENVIRONMENT_VARIABLE` or `dynamic::ARGUMENT_NAME`.
See the Credential Management guide and Configuration Reference for more information.
You can launch the gateway with `docker compose up`.
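A minimal sketch of a compose file that forwards the credential to the gateway; the image name, config mount, and port are assumptions and may differ in your setup:

```yaml
# docker-compose.yml — illustrative sketch only
services:
  gateway:
    image: tensorzero/gateway
    volumes:
      - ./config:/app/config:ro
    environment:
      - DEEPSEEK_API_KEY=${DEEPSEEK_API_KEY:?Missing DEEPSEEK_API_KEY}
    ports:
      - "3000:3000"
```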